
WriterForcing: Generating more interesting story endings

by Prakhar Gupta, et al.
Carnegie Mellon University

We study the problem of generating interesting endings for stories. Neural generative models have shown promising results for various text generation problems. Sequence to Sequence (Seq2Seq) models are typically trained to generate a single output sequence for a given input sequence. However, in the context of a story, multiple endings are possible. Seq2Seq models tend to ignore the context and generate generic and dull responses. Very few works have studied generating diverse and interesting story endings for a given story context. In this paper, we propose models which generate more diverse and interesting outputs by 1) training models to focus attention on important keyphrases of the story, and 2) promoting generation of non-generic words. We show that the combination of the two leads to more diverse and interesting endings.
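The second idea above, promoting non-generic words, can be illustrated with a small sketch. The code below is a hypothetical illustration, not the paper's exact formulation: it weights each target token's cross-entropy loss by its inverse frequency in the training corpus, so frequent generic tokens contribute less to the loss and rare, content-bearing tokens are emphasized. The function names and the weighting scheme are assumptions for illustration.

```python
import math

def inverse_token_frequency_weights(corpus_tokens):
    """Map each token to total_count / token_count (an inverse-frequency
    weight): generic, frequent words get small weights, rare words large ones.
    This is an illustrative scheme, not the paper's exact loss."""
    counts = {}
    for tok in corpus_tokens:
        counts[tok] = counts.get(tok, 0) + 1
    total = len(corpus_tokens)
    return {tok: total / c for tok, c in counts.items()}

def weighted_cross_entropy(token_probs, targets, weights):
    """Average negative log-likelihood of the target tokens, with each
    token's contribution scaled by its inverse-frequency weight."""
    loss = 0.0
    for probs, tok in zip(token_probs, targets):
        loss += weights.get(tok, 1.0) * -math.log(probs[tok])
    return loss / len(targets)
```

For example, in a corpus where "the" appears three times and "end" once, `inverse_token_frequency_weights` assigns "end" a weight three times larger, so a model that hedges toward generic tokens pays a relatively smaller penalty for them and is nudged toward less generic output.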


