Can Very Large Pretrained Language Models Learn Storytelling With A Few Examples?

01/24/2023
by   Zhuohan Xie, et al.

While pre-trained language models can generate individually fluent sentences for automatic story generation, they struggle to produce stories that are coherent, sensible, and interesting. Current state-of-the-art (SOTA) story generation models explore the use of higher-level features such as plots or commonsense knowledge to improve the quality of generated stories. Prompt-based learning with very large pre-trained language models (VLPLMs) such as GPT-3 has demonstrated impressive performance across various NLP tasks. In this paper, we present an extensive study using both automatic and human evaluation to compare the story generation capability of VLPLMs with these SOTA models on three datasets whose stories differ in style, register, and length. Our results show that VLPLMs generate stories of much higher quality than other story generation models and, to a certain extent, rival human authors, although preliminary investigation also reveals that they tend to “plagiarise” real stories in scenarios that involve world knowledge.
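The abstract does not spell out the prompt format used for few-shot story generation. As a rough illustration only, here is a minimal Python sketch, assuming a simple title-and-story demonstration format and a GPT-3-style text-completion interface; the example titles, stories, and the build_prompt helper are hypothetical and not taken from the paper.

# Hypothetical sketch of few-shot prompting for story generation.
# The demonstration pairs below are illustrative placeholders.
few_shot_examples = [
    ("The Lost Key", "Mia found a rusty key under the porch and spent the summer hunting for its lock."),
    ("A Quiet Storm", "The wind died just before midnight, and the town woke to a sky the colour of slate."),
]

def build_prompt(examples, new_title):
    """Concatenate (title, story) demonstrations, then ask for a story for the new title."""
    parts = []
    for title, story in examples:
        parts.append(f"Title: {title}\nStory: {story}\n")
    parts.append(f"Title: {new_title}\nStory:")
    return "\n".join(parts)

prompt = build_prompt(few_shot_examples, "The Last Train Home")
# In a setting like the one described in the abstract, a prompt of this form would be
# sent to a text-completion model such as GPT-3, and the generated continuation
# would be taken as the story.
print(prompt)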

Related research

12/16/2022 - Neural Story Planning
05/03/2023 - Can Large Language Models Be an Alternative to Human Evaluations?
10/02/2020 - MEGATRON-CNTRL: Controllable Story Generation with External Knowledge Using Large-Scale Language Models
06/07/2023 - World Models for Math Story Problems
12/21/2022 - CORRPUS: Detecting Story Inconsistencies via Codex-Bootstrapped Neurosymbolic Reasoning
06/04/2021 - COINS: Dynamically Generating COntextualized Inference Rules for Narrative Story Completion
09/27/2018 - Controllable Neural Story Generation via Reinforcement Learning
