Sequentially Controlled Text Generation

01/05/2023
by Alexander Spangher, et al.

While GPT-2 generates sentences that are remarkably human-like, longer documents can ramble and fail to follow human-like writing structure. We study the problem of imposing structure on long-range text. We propose a novel controlled text generation task, sequentially controlled text generation, and identify a dataset, NewsDiscourse, as a starting point for this task. We develop a sequential controlled text generation pipeline combining generation and editing. We test different degrees of structural awareness and show that, in general, more structural awareness results in higher control accuracy, grammaticality, coherence, and topicality, approaching human-level writing performance.
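
Concretely, sentence-level control of this kind can be sketched as a decoding loop in which each new sentence is conditioned on the document so far plus a discourse-role control code. The Python sketch below, using Hugging Face transformers, is a minimal illustration only: the `<Role>` tag format and the discourse labels are assumptions, and an off-the-shelf GPT-2 would need class-conditioned fine-tuning for such tags to exert real control. The paper's full pipeline, including its editing step, is not reproduced here.

```python
# Minimal sketch of sequentially controlled generation: one control code per
# sentence, prepended before decoding each sentence with GPT-2. Illustrative
# only; the tag format and labels are assumptions, and a base GPT-2 would
# need fine-tuning on tagged data for these codes to steer generation.
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

# Hypothetical sentence-level plan: one discourse role per sentence.
plan = ["Main Event", "Consequence", "Previous Event", "Evaluation"]

document = ""
for role in plan:
    # Condition on the document so far plus a control tag for the next sentence.
    prompt = f"{document} <{role}>".strip()
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(
        **inputs,
        max_new_tokens=40,
        do_sample=True,
        top_p=0.9,
        pad_token_id=tokenizer.eos_token_id,
    )
    # Keep only the newly generated continuation for this sentence.
    continuation = tokenizer.decode(
        outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
    )
    # Crude heuristic: truncate at the first sentence boundary.
    sentence = continuation.split(".")[0].strip() + "."
    document = (document + " " + sentence).strip()

print(document)
```

Per the abstract, the full pipeline also includes an editing pass that revises sentences whose realized discourse role drifts from the plan; that step is omitted in the sketch above.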
