Inferring the Reader: Guiding Automated Story Generation with Commonsense Reasoning

05/04/2021
by Xiangyu Peng, et al.

Transformer-based language models currently provide state-of-the-art results for automated story generation. However, they still suffer from plot incoherence when generating narratives over time, and they critically lack basic commonsense reasoning. Furthermore, existing methods generally focus on single-character stories or fail to track characters at all. To improve the coherence of generated narratives and to expand the scope of character-centric narrative generation, we introduce Commonsense-inference Augmented neural StoryTelling (CAST), a framework that introduces commonsense reasoning into the generation process while modeling the interactions between multiple characters. We find that CAST produces significantly more coherent and on-topic two-character stories, outperforming baselines on dimensions including plot plausibility and staying on topic. We also show how CAST can be used to further train language models so that they generate more coherent stories at reduced computation cost.
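The abstract does not detail how CAST checks generations, but the core idea, filtering language-model continuations against commonsense inferences about each character, can be sketched as a generate-and-filter loop. The Python sketch below is purely illustrative: `generate_candidates`, `infer_about`, `consistent`, and `cast_step` are hypothetical names, the keyword-overlap "inference" is a toy stand-in for a real commonsense inference model (e.g. a COMET-style model), and none of it should be read as the paper's actual pipeline.

```python
# Minimal sketch of a CAST-style generate-and-filter loop. NOT the
# authors' implementation: the abstract does not specify the models or
# the matching rule, so the "inference" here is a crude keyword proxy
# for a real commonsense model, and all helper names are hypothetical.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

def generate_candidates(context: str, n: int = 10) -> list[str]:
    """Sample n candidate continuations from the language model."""
    outputs = generator(context, num_return_sequences=n,
                        do_sample=True, max_new_tokens=30)
    # The pipeline returns prompt + continuation; strip the prompt.
    return [o["generated_text"][len(context):].strip() for o in outputs]

def infer_about(sentence: str, character: str) -> set[str]:
    """Stand-in for a commonsense inference model: a real system would
    return inferences about `character` (intents, effects, reactions);
    here we just use the sentence's longer content words as a proxy."""
    words = {w.lower().strip(".,!?") for w in sentence.split()}
    return {w for w in words if len(w) > 3}

def consistent(prior: set[str], new: set[str]) -> bool:
    """Hypothetical matching rule: accept a candidate whose inferences
    overlap with what is already believed about a character."""
    return bool(prior & new)

def cast_step(story: list[str], characters: list[str]) -> str:
    """Return the first sampled continuation whose inferences match the
    prior inferences for every character; else fall back to sampling."""
    context = " ".join(story)
    prior = {c: infer_about(story[-1], c) for c in characters}
    for candidate in generate_candidates(context):
        if all(consistent(prior[c], infer_about(candidate, c))
               for c in characters):
            return candidate
    return generate_candidates(context, n=1)[0]
```

In a real system, `infer_about` would query a trained commonsense model and the matching rule would compare structured inferences (e.g. a character's intents against another's reactions) rather than surface words, but the control flow, sample candidates, infer, filter, keep the consistent one, is the part this sketch is meant to convey.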


