PLANET: Dynamic Content Planning in Autoregressive Transformers for Long-form Text Generation

03/17/2022
by   Zhe Hu, et al.

Despite recent progress in pre-trained language models for generating fluent text, existing methods still suffer from incoherence in long-form text generation tasks, which require proper content control and planning to form a coherent high-level logical flow. In this work, we propose PLANET, a novel generation framework that leverages the autoregressive self-attention mechanism to conduct content planning and surface realization dynamically. To guide the generation of output sentences, our framework enriches the Transformer decoder with latent representations that maintain sentence-level semantic plans grounded by bags of words. Moreover, we introduce a new coherence-based contrastive learning objective to further improve the coherence of the output. Extensive experiments are conducted on two challenging long-form text generation tasks: counterargument generation and opinion article generation. Both automatic and human evaluations show that our method significantly outperforms strong baselines and generates more coherent texts with richer content.
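The coherence-based contrastive objective mentioned in the abstract can be illustrated with a minimal sketch. This is not the paper's implementation: the function names, the margin value, and the sentence-shuffling negative are all illustrative assumptions. The idea is simply that the model's coherence score for the gold (well-ordered) target should exceed the score for a perturbed, incoherent negative by some margin.

```python
# Hedged sketch of a margin-based contrastive coherence objective.
# All names (contrastive_coherence_loss, make_negative) are hypothetical
# illustrations, not the paper's actual API.

def contrastive_coherence_loss(pos_score: float, neg_score: float,
                               margin: float = 1.0) -> float:
    """Margin ranking loss: incur a penalty whenever the coherent
    target is not scored at least `margin` above the negative."""
    return max(0.0, margin - (pos_score - neg_score))

def make_negative(sentences):
    """One simple way to construct an incoherent negative sample:
    disrupt the discourse by reversing the sentence order."""
    return list(reversed(sentences))
```

In practice such a loss would be computed over model-produced coherence scores and added to the standard generation loss; the shuffling strategy for negatives is one common choice among several.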

Related research

06/01/2021
DYPLOC: Dynamic Planning of Content Using Mixed Language Models for Text Generation
We study the task of long-form opinion text generation, which faces at l...

05/19/2021
Long Text Generation by Modeling Sentence-Level and Discourse-Level Coherence
Generating long and coherent text is an important but challenging task, ...

10/05/2020
PAIR: Planning and Iterative Refinement in Pre-trained Transformers for Long Text Generation
Pre-trained Transformers have enabled impressive breakthroughs in genera...

04/17/2020
Rigid Formats Controlled Text Generation
Neural text generation has made tremendous progress in various tasks. On...

10/14/2020
Summarize, Outline, and Elaborate: Long-Text Generation via Hierarchical Supervision from Extractive Summaries
Long-text generation remains a challenge. The difficulty of generating c...

08/08/2022
Generating Coherent Narratives by Learning Dynamic and Discrete Entity States with a Contrastive Framework
Despite advances in generating fluent texts, existing pretraining models...

05/03/2022
Zero-shot Sonnet Generation with Discourse-level Planning and Aesthetics Features
Poetry generation, and creative language generation in general, usually ...
