-
Structural Information Preserving for Graph-to-Text Generation
The task of graph-to-text generation aims at producing sentences that pr...
-
A Graph-to-Sequence Model for AMR-to-Text Generation
The problem of AMR-to-text generation is to recover a text representing ...
-
Pragmatically Informative Text Generation
We improve the informativeness of models for conditional text generation...
-
Generación automática de frases literarias en español (Automatic Generation of Literary Sentences in Spanish)
In this work we present a state of the art in the area of Computational ...
-
Neural Academic Paper Generation
In this work, we tackle the problem of structured text generation, speci...
-
Enhancing AMR-to-Text Generation with Dual Graph Representations
Generating text from graph-based data, such as Abstract Meaning Represen...
-
PatentTransformer-2: Controlling Patent Text Generation by Structural Metadata
PatentTransformer is our codename for patent text generation based on Tr...
-
Online Back-Parsing for AMR-to-Text Generation
AMR-to-text generation aims to recover a text that carries the same meaning as an input AMR graph. Current research develops increasingly powerful graph encoders to better represent AMR graphs, while the decoders used to generate outputs are based on standard language modeling. We propose a decoder that back-predicts projected AMR graphs on the target sentence during text generation. As a result, our outputs preserve the input meaning better than those of standard decoders. Experiments on two AMR benchmarks show the superiority of our model over the previous state-of-the-art system based on the graph Transformer.
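The abstract above describes a decoder trained not only to generate text but also to back-predict the AMR graph projected onto the target sentence. A minimal sketch of the resulting multi-task objective follows; the function names, the token-level alignment of graph labels, and the weight `lam` are illustrative assumptions, not the paper's actual formulation.

```python
import math

def cross_entropy(log_probs, targets):
    """Mean negative log-likelihood of a target sequence."""
    return -sum(step[t] for step, t in zip(log_probs, targets)) / len(targets)

def joint_loss(text_log_probs, text_targets,
               graph_log_probs, graph_targets, lam=0.5):
    # standard text-generation loss over the output tokens
    l_text = cross_entropy(text_log_probs, text_targets)
    # auxiliary back-parsing loss: predict the AMR labels projected
    # onto each generated token (the alignment scheme here is a toy one)
    l_graph = cross_entropy(graph_log_probs, graph_targets)
    return l_text + lam * l_graph

# toy two-step example with a vocabulary of two symbols
text_lp  = [[math.log(0.7), math.log(0.3)], [math.log(0.4), math.log(0.6)]]
graph_lp = [[math.log(0.2), math.log(0.8)], [math.log(0.9), math.log(0.1)]]
loss = joint_loss(text_lp, [0, 1], graph_lp, [1, 0])
```

Training against both terms pushes the decoder's hidden states to retain graph structure, which is the mechanism the abstract credits for better meaning preservation.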