Sentence-Level Content Planning and Style Specification for Neural Text Generation

09/02/2019
by   Xinyu Hua, et al.

Building effective text generation systems requires three critical components, traditionally tackled as separate problems: content selection, text planning, and surface realization. Recent all-in-one neural generation models have made impressive progress, yet they often produce outputs that are incoherent and unfaithful to the input. To address these issues, we present an end-to-end trained two-step generation model, in which a sentence-level content planner first decides on the keyphrases to cover and a desired language style, followed by a surface realization decoder that generates relevant and coherent text. For experiments, we consider three tasks from domains with diverse topics and varying language styles: persuasive argument construction from Reddit, paragraph generation for normal and simple versions of Wikipedia, and abstract generation for scientific articles. Automatic evaluation shows that our system significantly outperforms competitive comparisons. Human judges further rate the text generated by our system as more fluent and correct than the generations of its variants that do not consider language style.
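The two-step pipeline described in the abstract can be sketched as follows. This is a minimal illustrative stand-in, not the paper's implementation: the real planner and realizer are trained neural networks, while here both steps are toy rule-based functions, and all names (`SentencePlan`, `content_planner`, `surface_realizer`) are assumptions for exposition.

```python
from dataclasses import dataclass
from typing import List

# Hypothetical sketch of the two-step generation pipeline:
# (1) a sentence-level content planner groups input keyphrases into
#     per-sentence plans and attaches a style label to each,
# (2) a surface realizer renders each plan into text, conditioning
#     on both the keyphrases and the style.

@dataclass
class SentencePlan:
    keyphrases: List[str]  # content to cover in this sentence
    style: str             # e.g. "claim" vs. "evidence"

def content_planner(input_keyphrases: List[str],
                    per_sentence: int = 2) -> List[SentencePlan]:
    """Group keyphrases into per-sentence plans, alternating styles."""
    plans = []
    for i in range(0, len(input_keyphrases), per_sentence):
        chunk = input_keyphrases[i:i + per_sentence]
        style = "claim" if (i // per_sentence) % 2 == 0 else "evidence"
        plans.append(SentencePlan(keyphrases=chunk, style=style))
    return plans

def surface_realizer(plan: SentencePlan) -> str:
    """Render one planned sentence; a trained decoder would generate
    fluent text conditioned on the plan instead of using templates."""
    body = " and ".join(plan.keyphrases)
    if plan.style == "claim":
        return f"We argue that {body} matter."
    return f"Studies show that {body} are linked."

def generate(input_keyphrases: List[str]) -> str:
    plans = content_planner(input_keyphrases)
    return " ".join(surface_realizer(p) for p in plans)

print(generate(["school funding", "test scores", "class size", "outcomes"]))
```

The key design point the sketch mirrors is the separation of concerns: the planner decides *what* each sentence says and in *which* style, while the realizer only decides *how* to say it, which is what lets the paper's model stay faithful to the input keyphrases.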


Related research

06/01/2021 · DYPLOC: Dynamic Planning of Content Using Mixed Language Models for Text Generation
We study the task of long-form opinion text generation, which faces at l...

06/09/2019 · Argument Generation with Retrieval, Planning, and Realization
Automatic argument generation is an appealing but challenging task. In t...

02/04/2021 · Data-to-text Generation with Macro Planning
Recent approaches to data-to-text generation have adopted the very succe...

05/24/2019 · Designing a Symbolic Intermediate Representation for Neural Surface Realization
Generated output from neural NLG systems often contain errors such as ha...

10/05/2020 · PAIR: Planning and Iterative Refinement in Pre-trained Transformers for Long Text Generation
Pre-trained Transformers have enabled impressive breakthroughs in genera...

08/06/2021 · Sentence Semantic Regression for Text Generation
Recall the classical text generation works, the generation framework can...

05/05/2023 · Stylized Data-to-Text Generation: A Case Study in the E-Commerce Domain
Existing data-to-text generation efforts mainly focus on generating a co...
