Topical Language Generation using Transformers

03/11/2021
by   Rohola Zandie, et al.

Large-scale transformer-based language models (LMs) demonstrate impressive capabilities in open-ended text generation. However, controlling properties of the generated text, such as topic, style, and sentiment, is challenging and often requires significant changes to the model architecture or retraining and fine-tuning on new supervised data. This paper presents a novel approach for Topical Language Generation (TLG) that combines a pre-trained LM with topic-modeling information. We cast the problem in a Bayesian formulation, with topic probabilities as the prior, LM probabilities as the likelihood, and the topical language generation probability as the posterior. In learning the model, we derive the topic probability distribution from the natural structure of user-provided documents. Furthermore, we extend the model with new parameters and functions that control how strongly topical features appear in the generated text, allowing easy control over its topical properties. Our experimental results demonstrate that our model outperforms state-of-the-art methods on coherency, diversity, and fluency while being faster at decoding.
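The Bayesian combination described above (posterior ∝ likelihood × prior) can be sketched as a simple re-weighting of the LM's next-token distribution. This is a minimal illustration, not the paper's exact formulation: the function name `topical_posterior` and the exponent `gamma` (standing in for the paper's strength-control parameters) are assumptions for the sake of the example.

```python
import numpy as np

def topical_posterior(lm_probs, topic_probs, gamma=1.0):
    """Combine the LM likelihood P(w | context) with a topic prior
    P(w | topic) via Bayes' rule, then renormalize over the vocabulary.
    gamma scales how strongly topical features show up in generation
    (gamma=0 recovers the plain LM distribution)."""
    scores = lm_probs * np.power(topic_probs, gamma)
    return scores / scores.sum()

# Toy 4-token vocabulary; index 2 is a strongly topical word.
lm = np.array([0.4, 0.3, 0.2, 0.1])     # LM next-token distribution
topic = np.array([0.1, 0.1, 0.7, 0.1])  # topic-word distribution (prior)

posterior = topical_posterior(lm, topic, gamma=1.0)
```

At decoding time such a posterior would replace the raw LM distribution at each step, boosting on-topic tokens without retraining or modifying the underlying transformer.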


Related research:

12/04/2019: Plug and Play Language Models: a Simple Approach to Controlled Text Generation
09/20/2021: A Plug-and-Play Method for Controlled Text Generation
06/05/2020: CoCon: A Self-Supervised Approach for Controlled Text Generation
09/11/2019: CTRL: A Conditional Transformer Language Model for Controllable Generation
11/08/2020: Adapting a Language Model for Controlled Affective Text Generation
07/29/2023: Towards Codable Text Watermarking for Large Language Models
03/29/2021: Changing the Mind of Transformers for Topically-Controllable Language Generation
