MEGATRON-CNTRL: Controllable Story Generation with External Knowledge Using Large-Scale Language Models

by Peng Xu, et al.

Existing pre-trained large language models have shown unparalleled generative capabilities. However, they are not controllable. In this paper, we propose MEGATRON-CNTRL, a novel framework that uses large-scale language models and adds control to text generation by incorporating an external knowledge base. Our framework consists of a keyword predictor, a knowledge retriever, a contextual knowledge ranker, and a conditional text generator. As we do not have access to ground-truth supervision for the knowledge ranker, we make use of weak supervision from sentence embeddings. The empirical results show that our model generates more fluent, consistent, and coherent stories with less repetition and higher diversity compared to prior work on the ROC story dataset. We showcase the controllability of our model by replacing the keywords used to generate stories and re-running the generation process. Human evaluation results show that 77.5% of these stories are successfully controlled by the new keywords. Furthermore, by scaling our model from 124 million to 8.3 billion parameters, we demonstrate that larger models improve both the quality of generation (from 74.5% to 93.0% for consistency) and controllability (from 77.5% to 91.5%).
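The four-stage pipeline described in the abstract can be illustrated with a minimal sketch. Every function body below is a hypothetical stand-in (word-overlap ranking instead of the paper's weakly supervised embedding ranker, a string template instead of the conditional language model); only the stage names and their order come from the abstract.

```python
# Hypothetical sketch of the MEGATRON-CNTRL generation pipeline:
# keyword predictor -> knowledge retriever -> contextual ranker -> generator.
# All logic here is illustrative, not the paper's implementation.

def predict_keywords(context):
    # Stand-in keyword predictor: pick the first two longer words.
    return [w.strip(".").lower() for w in context.split() if len(w) >= 6][:2]

def retrieve_knowledge(keywords, knowledge_base):
    # Stand-in retriever: look up facts keyed by the predicted keywords
    # (the external knowledge base is a plain dict in this sketch).
    return [fact for kw in keywords for fact in knowledge_base.get(kw, [])]

def rank_knowledge(context, facts):
    # Stand-in contextual ranker: the paper trains this stage with weak
    # supervision from sentence embeddings; here we rank by word overlap.
    ctx = set(context.lower().split())
    return sorted(facts, key=lambda f: -len(ctx & set(f.lower().split())))

def generate_sentence(context, ranked_facts):
    # Stand-in conditional generator (a large GPT-2-style LM in the paper).
    top = ranked_facts[0] if ranked_facts else ""
    return f"{context} [conditioned on: {top}]"

knowledge_base = {"winter": ["winter is cold", "winter has snow"]}
context = "Jenny loved winter mornings"
keywords = predict_keywords(context)
facts = rank_knowledge(context, retrieve_knowledge(keywords, knowledge_base))
story = generate_sentence(context, facts)
print(story)
```

Controllability in this framing amounts to overriding `keywords` (e.g. swapping in a user-chosen word) and re-running retrieval, ranking, and generation, which is the keyword-replacement experiment the abstract describes.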

