GPT-too: A language-model-first approach for AMR-to-text generation

05/18/2020
by Manuel Mager, et al.

Abstract Meaning Representations (AMRs) are broad-coverage sentence-level semantic graphs. Existing approaches to generating text from AMR have focused on training sequence-to-sequence or graph-to-sequence models on AMR-annotated data only. In this paper, we propose an alternative approach that combines a strong pre-trained language model with cycle consistency-based re-scoring. Despite the simplicity of the approach, our experimental results show that these models outperform all previous techniques on the English LDC2017T10 dataset, including the recent use of transformer architectures. In addition to the standard evaluation metrics, we provide human evaluation experiments that further substantiate the strength of our approach.
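To make the cycle consistency-based re-scoring concrete, the following Python sketch illustrates the general idea: generate several candidate sentences from the AMR with a pre-trained language model, re-parse each candidate back into AMR, and keep the sentence whose reconstructed graph best matches the input. This is a minimal illustration under assumed interfaces, not the paper's released implementation; generate_candidates, parse_to_amr, and amr_similarity are hypothetical placeholders for a fine-tuned GPT-2 decoder, an off-the-shelf AMR parser, and a Smatch-style graph similarity score.

```python
# Illustrative sketch of cycle-consistency re-scoring (not the authors' code).
# The three callables below are hypothetical stand-ins:
#   generate_candidates(amr, n) -> n candidate sentences from a pre-trained LM
#   parse_to_amr(sentence)      -> AMR graph re-parsed from the candidate text
#   amr_similarity(amr1, amr2)  -> graph similarity score (e.g., Smatch F1)

from typing import Callable, List, Tuple


def rescore_by_cycle_consistency(
    source_amr: str,
    generate_candidates: Callable[[str, int], List[str]],
    parse_to_amr: Callable[[str], str],
    amr_similarity: Callable[[str, str], float],
    num_candidates: int = 10,
) -> Tuple[str, float]:
    """Return the candidate sentence whose re-parsed AMR best matches the input AMR."""
    candidates = generate_candidates(source_amr, num_candidates)
    scored = []
    for sentence in candidates:
        reconstructed = parse_to_amr(sentence)              # cycle: text -> AMR
        score = amr_similarity(source_amr, reconstructed)   # compare to input graph
        scored.append((sentence, score))
    # Keep the sentence with the highest cycle-consistency score.
    return max(scored, key=lambda pair: pair[1])
```

In this sketch the language model only proposes candidates; the re-scoring step supplies the semantic check, which is why even a simple generator can benefit from it.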


Related research

09/21/2021  TrOCR: Transformer-based Optical Character Recognition with Pre-trained Models
Text recognition is a long-standing research problem for document digita...

08/31/2019  Modeling Graph Structure in Transformer for Better AMR-to-Text Generation
Recent studies on AMR-to-text generation often formalize the task as a s...

06/15/2022  A Survey: Neural Networks for AMR-to-Text
AMR-to-text is one of the key techniques in the NLP community that aims ...

11/10/2019  Distilling the Knowledge of BERT for Text Generation
Large-scale pre-trained language model, such as BERT, has recently achie...

08/21/2023  Can Language Models Learn to Listen?
We present a framework for generating appropriate facial responses from ...

04/15/2019  Pun Generation with Surprise
We tackle the problem of generating a pun sentence given a pair of homop...

04/04/2019  Complexity-Weighted Loss and Diverse Reranking for Sentence Simplification
Sentence simplification is the task of rewriting texts so they are easie...
