A Knowledge-Enhanced Pretraining Model for Commonsense Story Generation

01/15/2020
by Jian Guan, et al.

Story generation, namely generating a reasonable story from a leading context, is an important but challenging task. Despite their success in modeling fluency and local coherence, existing neural language generation models (e.g., GPT-2) still suffer from repetition, logic conflicts, and a lack of long-range coherence in generated stories. We conjecture that this is because of the difficulty of associating relevant commonsense knowledge, understanding causal relationships, and planning entities and events in the proper temporal order. In this paper, we devise a knowledge-enhanced pretraining model for commonsense story generation. We propose to utilize commonsense knowledge from external knowledge bases to generate reasonable stories. To further capture the causal and temporal dependencies between the sentences of a reasonable story, we employ multi-task learning, which combines the generation objective with a discriminative objective to distinguish true from fake stories during fine-tuning. Automatic and manual evaluations show that our model generates more reasonable stories than state-of-the-art baselines, particularly in terms of logic and global coherence.
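The multi-task fine-tuning objective described above can be sketched in a simplified form. The following is a minimal illustration, not the paper's actual implementation: it assumes the model supplies per-token probabilities for the generation loss and a single probability that the story is "true" for the discriminative loss, and combines them with a hypothetical weighting factor `weight`.

```python
import math

def lm_loss(token_probs):
    # Generation objective: average negative log-likelihood
    # of the gold story tokens under the model.
    return -sum(math.log(p) for p in token_probs) / len(token_probs)

def cls_loss(p_true, label):
    # Discriminative objective: binary cross-entropy, where
    # label 1 = true story and label 0 = constructed fake story.
    p = p_true if label == 1 else 1.0 - p_true
    return -math.log(p)

def multitask_loss(token_probs, p_true, label, weight=1.0):
    # Multi-task fine-tuning loss: generation loss plus a
    # weighted discriminative loss (weight is an assumed knob).
    return lm_loss(token_probs) + weight * cls_loss(p_true, label)
```

A perfectly confident, correct model incurs zero loss, while miscalibrated token probabilities or misclassified fake stories each raise the combined objective.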

Related research

- 08/30/2018: Story Ending Generation with Incremental Encoding and Commonsense Knowledge. "Story ending generation is a strong indication of story comprehension. T..."
- 04/06/2016: A Corpus and Evaluation Framework for Deeper Understanding of Commonsense Stories. "Representation and learning of commonsense knowledge is one of the found..."
- 12/13/2018: Find a Reasonable Ending for Stories: Does Logic Relation Help the Story Cloze Test? "Natural language understanding is a challenging problem that covers a wi..."
- 01/29/2022: Incorporating Commonsense Knowledge into Story Ending Generation via Heterogeneous Graph Networks. "Story ending generation is an interesting and challenging task, which ai..."
- 10/31/2018: Picking Apart Story Salads. "During natural disasters and conflicts, information about what happened ..."
- 05/04/2021: Inferring the Reader: Guiding Automated Story Generation with Commonsense Reasoning. "Transformer-based language model approaches to automated story generatio..."
- 06/04/2021: COINS: Dynamically Generating COntextualized Inference Rules for Narrative Story Completion. "Despite recent successes of large pre-trained language models in solving..."
