A Graph-to-Sequence Model for AMR-to-Text Generation

05/07/2018
by   Linfeng Song, et al.

The problem of AMR-to-text generation is to recover a text representing the same meaning as an input AMR graph. The current state-of-the-art method uses a sequence-to-sequence model, leveraging an LSTM to encode a linearized AMR structure. Although able to model non-local semantic information, a sequence LSTM can lose information from the AMR graph structure, and thus faces challenges with large graphs, which result in long sequences. We introduce a neural graph-to-sequence model that uses a novel LSTM structure to directly encode graph-level semantics. On a standard benchmark, our model shows superior results to existing methods in the literature.
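The core idea of encoding graph-level semantics with an LSTM can be sketched as iterative message passing: each AMR node keeps a hidden and cell state, and at every step it aggregates its neighbors' hidden states before applying standard LSTM gating, so information propagates one edge-hop per step rather than along a linearization. The sketch below is a minimal illustration under assumed names and sizes, not the authors' exact model (it omits edge labels, directionality, and learned embeddings):

```python
import numpy as np

# Minimal sketch of a graph-state LSTM step (all names, sizes, and the
# toy AMR graph are illustrative assumptions, not the paper's model).
rng = np.random.default_rng(0)
D = 8                                   # hidden size (assumption)
nodes = ["want-01", "boy", "go-02"]     # toy AMR concepts
edges = [(0, 1), (0, 2), (2, 1)]        # e.g. ARG0/ARG1 links

N = len(nodes)
adj = np.zeros((N, N))
for s, t in edges:                      # treat edges as undirected here
    adj[s, t] = adj[t, s] = 1.0

x = rng.standard_normal((N, D))         # node (concept) embeddings
h = np.zeros((N, D))                    # per-node hidden states
c = np.zeros((N, D))                    # per-node cell states
W = rng.standard_normal((4, 2 * D, D)) * 0.1  # gate weights: i, f, o, g

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for _ in range(5):                      # a few message-passing steps
    m = adj @ h                         # sum of neighbor hidden states
    z = np.concatenate([x, m], axis=1)  # [node input ; neighbor message]
    i = sigmoid(z @ W[0])               # input gate
    f = sigmoid(z @ W[1])               # forget gate
    o = sigmoid(z @ W[2])               # output gate
    g = np.tanh(z @ W[3])               # candidate cell update
    c = f * c + i * g
    h = o * np.tanh(c)

print(h.shape)                          # one state vector per AMR node
```

After a few steps, each node's state reflects its multi-hop graph neighborhood; a decoder can then attend over these states, which is what lets the encoder respect graph structure that a linearized sequence would flatten.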



Related research

- 10/23/2018, "Deep Graph Convolutional Encoders for Structured Data to Text Generation": Most previous work on neural text generation from graph-structured data ...
- 12/03/2019, "AMR-to-Text Generation with Cache Transition Systems": Text generation from AMR involves emitting sentences that reflect the me...
- 10/09/2020, "Online Back-Parsing for AMR-to-Text Generation": AMR-to-text generation aims to recover a text containing the same meanin...
- 10/02/2019, "Clinical Text Generation through Leveraging Medical Concept and Relations": With a neural sequence generation model, this study aims to develop a me...
- 10/09/2020, "Lightweight, Dynamic Graph Convolutional Networks for AMR-to-Text Generation": AMR-to-text generation is used to transduce Abstract Meaning Representat...
- 07/12/2020, "Sparse Graph to Sequence Learning for Vision Conditioned Long Textual Sequence Generation": Generating longer textual sequences when conditioned on the visual infor...
- 02/12/2021, "Structural Information Preserving for Graph-to-Text Generation": The task of graph-to-text generation aims at producing sentences that pr...
