Strategies for Structuring Story Generation

02/04/2019
by Angela Fan, et al.

Writers generally rely on plans or sketches to write long stories, but most current language models generate word by word from left to right. We explore coarse-to-fine models for creating narrative texts of several hundred words, and introduce new models which decompose stories by abstracting over actions and entities. The model first generates the predicate-argument structure of the text, where different mentions of the same entity are marked with placeholder tokens. It then generates a surface realization of the predicate-argument structure, and finally replaces the entity placeholders with context-sensitive names and references. Human judges prefer the stories from our models to a wide range of previous approaches to hierarchical text generation. Extensive analysis shows that our methods can help improve the diversity and coherence of events and entities in generated stories.
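The entity-abstraction step described above can be illustrated with a minimal sketch. This is an assumption-laden toy, not the paper's model: the placeholder scheme (`ent0`, `ent1`, …) and the simple string substitution stand in for what the paper does with learned, context-sensitive surface realization.

```python
import re

def anonymize_entities(story, entities):
    """Replace each mention of each entity with a placeholder token,
    so a model can generate abstract structure before naming anyone.
    Simple regex substitution; a real pipeline would use coreference
    resolution to find mentions."""
    mapping = {}
    for i, name in enumerate(entities):
        placeholder = f"ent{i}"
        mapping[placeholder] = name
        story = re.sub(rf"\b{re.escape(name)}\b", placeholder, story)
    return story, mapping

def realize_entities(template, mapping):
    """Fill placeholders back in. In the paper this step is a trained
    model choosing context-sensitive references (full name, pronoun,
    nominal); here we just restore the original strings."""
    for placeholder, name in mapping.items():
        template = re.sub(rf"\b{placeholder}\b", name, template)
    return template

abstracted, mapping = anonymize_entities(
    "Anna met the dragon. Anna fled.", ["Anna", "the dragon"])
# abstracted == "ent0 met ent1. ent0 fled."
restored = realize_entities(abstracted, mapping)
# restored == "Anna met the dragon. Anna fled."
```

Generating over the anonymized text first lets the model reuse event structure across stories regardless of which names appear, which is one way such a decomposition can improve the diversity and coherence of entities.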


Related research

11/14/2018
Plan-And-Write: Towards Better Automatic Storytelling
Automatic storytelling is challenging since it requires generating long, ...

10/19/2022
DALLE-2 is Seeing Double: Flaws in Word-to-Concept Mapping in Text2Image Models
We study the way DALLE-2 maps symbols (words) in the prompt to their ref...

05/25/2022
RSTGen: Imbuing Fine-Grained Interpretable Control into Long-Form Text Generators
In this paper, we study the task of improving the cohesion and coherence...

09/04/2019
Referring Expression Generation Using Entity Profiles
Referring Expression Generation (REG) is the task of generating contextu...

08/07/2023
Storyfier: Exploring Vocabulary Learning Support with Text Generation Models
Vocabulary learning support tools have widely exploited existing materia...

06/18/2022
Argumentative Text Generation in Economic Domain
The development of large and super-large language models, such as GPT-3,...

02/08/2020
Blank Language Models
We propose Blank Language Model (BLM), a model that generates sequences ...
