Online Back-Parsing for AMR-to-Text Generation

10/09/2020
by Xuefeng Bai, et al.

AMR-to-text generation aims to recover a text that conveys the same meaning as an input AMR graph. Current research develops increasingly powerful graph encoders to better represent AMR graphs, while decoders based on standard language modeling are used to generate outputs. We propose a decoder that, during text generation, back-predicts the AMR graph projected onto the target sentence. As a result, our outputs preserve the input meaning better than those of standard decoders. Experiments on two AMR benchmarks show that our model outperforms the previous state-of-the-art system based on the graph Transformer.
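The training objective implied by the abstract can be illustrated with a simplified sketch: the standard generation loss is combined with an auxiliary back-parsing loss that asks the decoder, at each step, to predict the AMR graph label aligned to the emitted token. The function names, the alignment assumption, and the weighting scheme below are illustrative, not the paper's actual formulation.

```python
import math

def cross_entropy(probs, gold):
    # Negative log-likelihood of the gold index under a distribution.
    return -math.log(probs[gold])

def joint_loss(token_dists, gold_tokens, graph_dists, gold_graph, lam=0.5):
    """Hypothetical joint objective: generation loss plus a back-parsing
    loss over per-step graph-label predictions, weighted by `lam`."""
    gen = sum(cross_entropy(d, g) for d, g in zip(token_dists, gold_tokens))
    back = sum(cross_entropy(d, g) for d, g in zip(graph_dists, gold_graph))
    return gen + lam * back
```

With perfect predictions (probability 1.0 on every gold label) the loss is zero; any probability mass diverted from the gold graph labels raises the loss, which is the mechanism by which back-parsing is meant to encourage meaning preservation.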


Related research

- Structural Information Preserving for Graph-to-Text Generation (02/12/2021)
- A Graph-to-Sequence Model for AMR-to-Text Generation (05/07/2018)
- Investigating the Effect of Relative Positional Embeddings on AMR-to-Text Generation with Structural Adapters (02/12/2023)
- Pragmatically Informative Text Generation (04/02/2019)
- Enhancing AMR-to-Text Generation with Dual Graph Representations (09/01/2019)
- Sentence Semantic Regression for Text Generation (08/06/2021)
- PatentTransformer-2: Controlling Patent Text Generation by Structural Metadata (01/11/2020)
