Event Transition Planning for Open-ended Text Generation

04/20/2022
by Qintong Li, et al.

Open-ended text generation tasks, such as dialogue generation and story completion, require models to generate a coherent continuation from limited preceding context. The open-ended nature of these tasks poses new challenges for today's neural auto-regressive text generators. Although these neural models are good at producing human-like text, they struggle to arrange the causalities and relations between given facts and possible ensuing events. To bridge this gap, we propose a novel two-stage method that explicitly arranges the ensuing events in open-ended text generation. Our approach can be understood as a specially trained coarse-to-fine algorithm, in which an event transition planner provides a "coarse" plot skeleton and a text generator in the second stage refines the skeleton. Experiments on two open-ended text generation tasks demonstrate that our proposed method effectively improves the quality of the generated text, especially in coherence and diversity. The code is available at: <https://github.com/qtli/EventPlanforTextGen>.
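The two-stage, coarse-to-fine pipeline described above can be sketched as follows. This is a minimal toy illustration, not the authors' trained models: `plan_events` (standing in for the event transition planner) and `refine_skeleton` (standing in for the text generator) are hypothetical functions with hard-coded behavior, shown only to make the planner-then-refiner data flow concrete.

```python
def plan_events(context):
    """Stage 1 (event transition planner): map the preceding context to a
    coarse skeleton of ensuing events. A real planner would be a trained
    model; here we use a toy lookup table of event transitions."""
    transitions = {
        "lost wallet": ["search pockets", "retrace steps", "find wallet"],
    }
    return transitions.get(context, ["continue the story"])


def refine_skeleton(context, skeleton):
    """Stage 2 (text generator): refine the coarse plot skeleton into a
    fluent continuation, conditioned on both the context and the plan."""
    return (f"After the {context}, she decided to "
            + ", then ".join(skeleton) + ".")


context = "lost wallet"
skeleton = plan_events(context)                  # "coarse" plot skeleton
continuation = refine_skeleton(context, skeleton)  # refined surface text
```

Conditioning the second stage on an explicit event skeleton, rather than on the raw context alone, is what lets the method arrange ensuing events before committing to surface realization.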


