Narrative Text Generation with a Latent Discrete Plan

10/07/2020
by Harsh Jhamtani, et al.

Past work on story generation has demonstrated the usefulness of conditioning on a generation plan to produce coherent stories. However, these approaches have used heuristics or off-the-shelf models to first tag training stories with the desired type of plan, and then trained generation models in a supervised fashion. In this paper, we propose a deep latent variable model that first samples a sequence of anchor words, one per sentence in the story, as part of its generative process. During training, our model treats the sequence of anchor words as a latent variable and attempts to induce anchoring sequences that help guide generation in an unsupervised fashion. We conduct experiments with several types of sentence decoder distributions: left-to-right and non-monotonic, with different degrees of restriction. Further, since we use amortized variational inference to train our model, we introduce two corresponding types of inference networks for predicting the posterior over anchor words. Human evaluations show that stories produced by our model are rated higher than those from baselines that do not use story plans, and are similar or better in quality relative to baselines that use external supervision for plans. Additionally, the proposed model achieves favorable scores on perplexity, diversity, and control of the story via the discrete plan.
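To make the described generative process concrete, below is a minimal PyTorch sketch: the model first samples one anchor word per sentence from a plan prior, then decodes each sentence left-to-right conditioned on its anchor. The class name LatentPlanStoryModel, the toy vocabulary sizes, and the GRU architecture are illustrative assumptions rather than the paper's implementation; the non-monotonic decoders and the two inference-network variants mentioned in the abstract are not reproduced here.

```python
# Minimal sketch (not the paper's code): a story model that samples one anchor
# word per sentence, then decodes each sentence conditioned on its anchor.
# All sizes, names, and the GRU decoder choice are illustrative assumptions.
import torch
import torch.nn as nn
from torch.distributions import Categorical

VOCAB_SIZE = 1000    # toy word vocabulary (assumption)
ANCHOR_SIZE = 200    # toy anchor-word vocabulary (assumption)
EMB, HID = 64, 128

class LatentPlanStoryModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.anchor_emb = nn.Embedding(ANCHOR_SIZE, EMB)
        self.word_emb = nn.Embedding(VOCAB_SIZE, EMB)
        # p(z_t | z_{<t}): prior over the next anchor given previous anchors
        self.plan_rnn = nn.GRU(EMB, HID, batch_first=True)
        self.plan_out = nn.Linear(HID, ANCHOR_SIZE)
        # p(w_{t,i} | w_{t,<i}, z_t): left-to-right sentence decoder conditioned
        # on the sentence's anchor (non-monotonic decoders are not sketched).
        self.sent_rnn = nn.GRUCell(EMB + EMB, HID)
        self.sent_out = nn.Linear(HID, VOCAB_SIZE)

    @torch.no_grad()
    def generate(self, n_sentences=5, max_len=12):
        plan_h = torch.zeros(1, 1, HID)
        prev_anchor = torch.zeros(1, 1, EMB)   # "start" anchor embedding
        story, anchors = [], []
        for _ in range(n_sentences):
            # 1) sample the next anchor word from the plan prior
            _, plan_h = self.plan_rnn(prev_anchor, plan_h)
            z = Categorical(logits=self.plan_out(plan_h[-1])).sample()   # (1,)
            anchors.append(z.item())
            z_emb = self.anchor_emb(z)                                   # (1, EMB)
            prev_anchor = z_emb.unsqueeze(1)
            # 2) decode the sentence left-to-right, conditioned on the anchor
            h = torch.zeros(1, HID)
            w = torch.zeros(1, EMB)            # "start of sentence" embedding
            sent = []
            for _ in range(max_len):
                h = self.sent_rnn(torch.cat([w, z_emb], dim=-1), h)
                word = Categorical(logits=self.sent_out(h)).sample()
                sent.append(word.item())
                w = self.word_emb(word)
            story.append(sent)
        return anchors, story

# During training the anchors z are unobserved: the model maximizes an ELBO,
#   E_{q(z|x)}[ log p(x, z) - log q(z|x) ],
# where q(z|x) is an amortized inference network predicting a posterior over
# anchor words for each sentence (the paper describes two such networks).
model = LatentPlanStoryModel()
anchors, story = model.generate()
print("sampled anchor ids:", anchors)
```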

Related research

Towards Inter-character Relationship-driven Story Generation (11/01/2022)
In this paper, we introduce the task of modeling interpersonal relations...

Little Red Riding Hood Goes Around the Globe: Crosslingual Story Planning and Generation with Large Language Models (12/20/2022)
We consider the problem of automatically generating stories in multiple ...

Plan-And-Write: Towards Better Automatic Storytelling (11/14/2018)
Automatic storytelling is challenging since it requires generating long,...

Exploring Story Generation with Multi-task Objectives in Variational Autoencoders (11/15/2021)
GPT-2 has been frequently adapted in story generation models as it provi...

Paraphrase Generation with Latent Bag of Words (01/07/2020)
Paraphrase generation is a longstanding important problem in natural lan...

A Temporal Variational Model for Story Generation (09/14/2021)
Recent language models can generate interesting and grammatically correc...

Narrative Interpolation for Generating and Understanding Stories (08/17/2020)
We propose a method for controlled narrative/story generation where we a...
