Generating syntactically varied realisations from AMR graphs

04/20/2018
by Kris Cao, et al.

Generating from Abstract Meaning Representation (AMR) is an underspecified problem, as many syntactic decisions are not specified by the semantic graph. We learn a sequence-to-sequence model that generates possible constituency trees for an AMR graph, and then train another model to generate text realisations conditioned on both an AMR graph and a constituency tree. We show that factorising the model this way lets us effectively use parse information, obtaining competitive BLEU scores on self-generated parses and impressive BLEU scores with oracle parses. We also demonstrate that we can generate meaning-preserving syntactic paraphrases of the same AMR graph.
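The factorisation described above can be sketched in miniature: first a model proposes candidate constituency trees for an AMR graph, then a second model realises text conditioned on both the graph and a chosen tree. In the toy sketch below the two seq2seq models are replaced by lookup tables, and all names (`parse_model`, `realise_model`, `generate`) are illustrative, not taken from the paper.

```python
# Sketch of the two-stage factorisation: p(text | AMR) is decomposed as
# p(tree | AMR) * p(text | AMR, tree). Real models are seq2seq networks;
# here, toy lookup tables stand in for them.

AMR = "(w / want-01 :ARG0 (b / boy) :ARG1 (g / go-01 :ARG0 b))"

# Stage 1 stand-in: propose constituency trees for an AMR graph.
# Multiple trees per graph is what yields syntactic variation.
parse_model = {
    AMR: ["(S (NP) (VP (V) (S)))",
          "(S (NP) (VP (V) (NP)))"],
}

# Stage 2 stand-in: realise text conditioned on (AMR graph, tree).
realise_model = {
    (AMR, "(S (NP) (VP (V) (S)))"): "The boy wants to go.",
    (AMR, "(S (NP) (VP (V) (NP)))"): "The boy wants a departure.",
}

def generate(amr: str) -> list[str]:
    """Return one realisation per candidate parse: syntactic paraphrases
    of the same meaning representation."""
    return [realise_model[(amr, tree)] for tree in parse_model[amr]]

for sentence in generate(AMR):
    print(sentence)
```

Conditioning on an oracle tree corresponds to replacing `parse_model` with the gold parse, which is why oracle parses give much higher BLEU: stage 2 no longer has to absorb stage 1's errors.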


Related research

03/27/2019 - Structural Neural Encoders for AMR-to-text Generation
AMR-to-text generation is a problem recently introduced to the NLP commu...

07/24/2017 - Transition-Based Generation from Abstract Meaning Representations
This work addresses the task of generating English sentences from Abstra...

04/26/2017 - Neural AMR: Sequence-to-Sequence Models for Parsing and Generation
Sequence-to-sequence models have shown strong performance across a broad...

11/18/2019 - Graph Transformer for Graph-to-Sequence Learning
The dominant graph-to-sequence transduction models employ graph neural n...

03/24/2022 - SMARAGD: Synthesized sMatch for Accurate and Rapid AMR Graph Distance
The semantic similarity of graph-based meaning representations, such as ...

04/25/2019 - Neural Text Generation from Rich Semantic Representations
We propose neural models to generate high-quality text from structured r...

05/05/2020 - Smart To-Do: Automatic Generation of To-Do Items from Emails
Intelligent features in email service applications aim to increase produ...
