Data-to-text Generation with Macro Planning

02/04/2021
by Ratish Puduppully, et al.

Recent approaches to data-to-text generation have adopted the very successful encoder-decoder architecture or variants thereof. These models generate text which is fluent (but often imprecise) and perform quite poorly at selecting appropriate content and ordering it coherently. To overcome some of these issues, we propose a neural model with a macro planning stage followed by a generation stage reminiscent of traditional methods which embrace separate modules for planning and surface realization. Macro plans represent high level organization of important content such as entities, events and their interactions; they are learnt from data and given as input to the generator. Extensive experiments on two data-to-text benchmarks (RotoWire and MLB) show that our approach outperforms competitive baselines in terms of automatic and human evaluation.
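The abstract describes a two-stage architecture: a macro planner that selects and orders high-level paragraph plans (built from entities, events and their interactions), followed by an encoder-decoder generator conditioned on the chosen plan. The toy sketch below illustrates one way such a pipeline could be wired up. It is not the authors' implementation; all module names, dimensions and the random toy inputs are assumptions made purely for illustration.

```python
# A minimal sketch (not the paper's code) of a macro-plan-then-generate
# pipeline: a planner scores candidate paragraph plans and orders them,
# and an encoder-decoder realises text from the selected plan.
import torch
import torch.nn as nn

class MacroPlanner(nn.Module):
    """Scores candidate paragraph plans and returns a predicted ordering."""
    def __init__(self, vocab_size, hidden=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden)
        self.encoder = nn.GRU(hidden, hidden, batch_first=True)
        self.score = nn.Linear(hidden, 1)

    def forward(self, candidate_plans):              # list of 1-D LongTensors of token ids
        scores = []
        for plan in candidate_plans:
            _, h = self.encoder(self.embed(plan.unsqueeze(0)))
            scores.append(self.score(h[-1]).squeeze())
        scores = torch.stack(scores)
        order = torch.argsort(scores, descending=True)   # predicted plan order
        return order, scores

class PlanConditionedGenerator(nn.Module):
    """Encoder-decoder that realises text conditioned on the selected macro plan."""
    def __init__(self, vocab_size, hidden=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden)
        self.encoder = nn.GRU(hidden, hidden, batch_first=True)
        self.decoder = nn.GRU(hidden, hidden, batch_first=True)
        self.out = nn.Linear(hidden, vocab_size)

    def forward(self, plan_ids, target_ids):
        _, h = self.encoder(self.embed(plan_ids.unsqueeze(0)))
        dec_out, _ = self.decoder(self.embed(target_ids.unsqueeze(0)), h)
        return self.out(dec_out)                     # logits over the output vocabulary

# Toy usage: two candidate paragraph plans (e.g. verbalised player/team
# entities and game events), ranked by the planner, then realised.
vocab_size = 100
planner = MacroPlanner(vocab_size)
generator = PlanConditionedGenerator(vocab_size)
plans = [torch.randint(0, vocab_size, (6,)), torch.randint(0, vocab_size, (8,))]
order, _ = planner(plans)
chosen = plans[order[0].item()]
logits = generator(chosen, torch.randint(0, vocab_size, (12,)))
print(order.tolist(), logits.shape)                  # e.g. [1, 0] torch.Size([1, 12, 100])
```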

Related research

04/06/2019 · Step-by-Step: Separating Planning from Realization in Neural Data-to-Text Generation
Data-to-text generation can be conceptually divided into two parts: orde...

09/02/2019 · Sentence-Level Content Planning and Style Specification for Neural Text Generation
Building effective text generation systems requires three critical compo...

09/22/2019 · Improving Quality and Efficiency in Plan-based Neural Data-to-Text Generation
We follow the step-by-step approach to neural data-to-text generation we...

07/23/2019 · Learning to Select, Track, and Generate for Data-to-Text
We propose a data-to-text generation model with two modules, one for tra...

02/28/2022 · Data-to-text Generation with Variational Sequential Planning
We consider the task of data-to-text generation, which aims to create te...

11/21/2019 · Automatically Generating Macro Research Reports from a Piece of News
Automatically generating macro research reports from economic news is an...

11/10/2018 · Adversarially-Trained Normalized Noisy-Feature Auto-Encoder for Text Generation
This article proposes Adversarially-Trained Normalized Noisy-Feature Aut...
