Neural AMR: Sequence-to-Sequence Models for Parsing and Generation

04/26/2017
by   Ioannis Konstas, et al.

Sequence-to-sequence models have shown strong performance across a broad range of applications. However, their application to parsing and generating text using Abstract Meaning Representation (AMR) has been limited, due to the relatively limited amount of labeled data and the non-sequential nature of the AMR graphs. We present a novel training procedure that can lift this limitation using millions of unlabeled sentences and careful preprocessing of the AMR graphs. For AMR parsing, our model achieves competitive results of 62.1 SMATCH, the current best score reported without significant use of external semantic resources. For AMR generation, our model establishes a new state-of-the-art performance of BLEU 33.8. We present extensive ablative and qualitative analysis including strong evidence that sequence-based AMR models are robust against ordering variations of graph-to-sequence conversions.
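To make the graph-to-sequence conversion mentioned above concrete, here is a minimal sketch (not the paper's exact preprocessing pipeline) of how an AMR graph can be linearized into a token sequence via depth-first traversal, so that a standard seq2seq model can consume it. The graph encoding and example sentence are illustrative assumptions.

```python
# Illustrative sketch, not the authors' exact pipeline: linearize an
# AMR-like graph into a flat token sequence by depth-first traversal.
# The tuple-based graph encoding below is a hypothetical convenience.

def linearize(node):
    """Depth-first linearization of a simple AMR-like graph.

    `node` is a tuple (concept, [(relation, child), ...]); a child is
    either another such tuple or a plain string (a re-entrant variable
    or constant).
    """
    concept, edges = node
    tokens = ["(", concept]
    for relation, child in edges:
        tokens.append(relation)
        if isinstance(child, tuple):
            tokens.extend(linearize(child))  # recurse into subgraph
        else:
            tokens.append(child)             # leaf: variable/constant
    tokens.append(")")
    return tokens

# "The boy wants to go": want-01 with :ARG0 boy and :ARG1 go-01,
# where go-01 re-uses boy as its :ARG0 (a re-entrancy).
amr = ("want-01", [(":ARG0", ("boy", [])),
                   (":ARG1", ("go-01", [(":ARG0", "boy")]))])
print(" ".join(linearize(amr)))
# → ( want-01 :ARG0 ( boy ) :ARG1 ( go-01 :ARG0 boy ) )
```

Different traversal orders over the same graph yield different token sequences; the paper's ablations suggest sequence-based AMR models are largely insensitive to this choice.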


Related research

09/04/2018 · Sequence-to-Action: End-to-End Semantic Graph Generation for Semantic Parsing
This paper proposes a neural semantic parsing approach -- Sequence-to-Ac...

10/24/2022 · Structural generalization is hard for sequence-to-sequence models
Sequence-to-sequence (seq2seq) models have been successful across many N...

02/16/2017 · Addressing the Data Sparsity Issue in Neural AMR Parsing
Neural attention models have achieved great success in different NLP tas...

09/30/2019 · Semantic Graph Parsing with Recurrent Neural Network DAG Grammars
Semantic parses are directed acyclic graphs (DAGs), so semantic parsing ...

01/01/2023 · Semantic Operator Prediction and Applications
In the present paper, semantic parsing challenges are briefly introduced...

04/20/2018 · Generating syntactically varied realisations from AMR graphs
Generating from Abstract Meaning Representation (AMR) is an underspecifi...

02/05/2023 · Unleashing the True Potential of Sequence-to-Sequence Models for Sequence Tagging and Structure Parsing
Sequence-to-Sequence (S2S) models have achieved remarkable success on va...
