Plot Writing From Pre-Trained Language Models

06/07/2022
by Yiping Jin, et al.

Pre-trained language models (PLMs) fail to generate long-form narrative text because they do not consider global structure. As a result, the generated texts are often incohesive, repetitive, or lacking in content. Recent work in story generation reintroduced explicit content planning in the form of prompts, keywords, or semantic frames. Trained on large parallel corpora, these models can generate more logical event sequences and thus more contentful stories. However, these intermediate representations are often not in natural language and cannot be utilized by PLMs without fine-tuning. We propose generating story plots using off-the-shelf PLMs while maintaining the benefit of content planning to generate cohesive and contentful stories. Our proposed method, ScratchPlot, first prompts a PLM to compose a content plan. Then, we generate the story's body and ending conditioned on the content plan. Furthermore, we take a generate-and-rank approach, using additional PLMs to rank the generated (story, ending) pairs. We benchmark our method against various baselines and achieve superior results in both human and automatic evaluation.
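The abstract outlines a three-stage pipeline: prompt a PLM for a content plan, generate the story body and ending conditioned on that plan, and rank candidate (story, ending) pairs with additional PLMs. The snippet below is only a minimal sketch of that generate-and-rank idea, not the authors' ScratchPlot implementation; the prompts, the gpt2-large checkpoint, and the perplexity-based ranker are illustrative assumptions.

```python
# Minimal sketch of the plan -> generate -> rank pipeline described above.
# NOT the authors' code: prompts, model choice, and perplexity ranking are assumptions.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

device = "cuda" if torch.cuda.is_available() else "cpu"
tok = AutoTokenizer.from_pretrained("gpt2-large")
lm = AutoModelForCausalLM.from_pretrained("gpt2-large").to(device)

def generate(prompt, max_new_tokens=60, num_return_sequences=1):
    """Sample continuations of `prompt` from the off-the-shelf PLM."""
    inputs = tok(prompt, return_tensors="pt").to(device)
    out = lm.generate(
        **inputs,
        do_sample=True,
        top_p=0.9,
        max_new_tokens=max_new_tokens,
        num_return_sequences=num_return_sequences,
        pad_token_id=tok.eos_token_id,
    )
    # Keep only the newly generated tokens, not the prompt.
    return tok.batch_decode(out[:, inputs["input_ids"].shape[1]:],
                            skip_special_tokens=True)

def perplexity(text):
    """Score a candidate with a second PLM pass (lower = more fluent)."""
    enc = tok(text, return_tensors="pt").to(device)
    with torch.no_grad():
        loss = lm(**enc, labels=enc["input_ids"]).loss
    return torch.exp(loss).item()

# 1. Content plan: prompt the PLM for a short plot outline.
plan = generate("Write a one-sentence plot outline for a short story:\n", 40)[0]

# 2. Story body conditioned on the plan.
body = generate(f"Outline: {plan}\nStory:\n", 120)[0]

# 3. Sample several endings and rank the (story, ending) pairs.
candidates = generate(f"Outline: {plan}\nStory:\n{body}\nEnding:\n",
                      max_new_tokens=60, num_return_sequences=5)
best_ending = min(candidates, key=lambda e: perplexity(body + " " + e))
print(plan, body, best_ending, sep="\n\n")
```

In the paper the ranking step uses dedicated PLMs to score how well an ending fits the story; the perplexity heuristic here is just a stand-in to make the generate-and-rank loop concrete.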

Related research:

09/21/2020 - Content Planning for Neural Story Generation with Aristotelian Rescoring
12/09/2022 - Plug-and-Play Recipe Generation with Content Planning
06/07/2023 - World Models for Math Story Problems
11/03/2020 - Modeling Event Salience in Narratives via Barthes' Cardinal Functions
08/29/2017 - Generating Sentence Planning Variations for Story Telling
12/20/2022 - Future Sight: Dynamic Story Generation with Large Pretrained Language Models
09/14/2021 - A Temporal Variational Model for Story Generation
