On-the-Fly Controlled Text Generation with Experts and Anti-Experts

05/07/2021
by   Alisa Liu, et al.

Despite recent advances in natural language generation, it remains challenging to control attributes of generated text. We propose DExperts: Decoding-time Experts, a decoding-time method for controlled text generation which combines a pretrained language model with experts and/or anti-experts in an ensemble of language models. Intuitively, under our ensemble, output tokens only get high probability if they are considered likely by the experts, and unlikely by the anti-experts. We apply DExperts to language detoxification and sentiment-controlled generation, where we outperform existing controllable generation methods on both automatic and human evaluations. Our work highlights the promise of using LMs trained on text with (un)desired attributes for efficient decoding-time controlled language generation.
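The abstract's intuition, that a token gets high probability only when the experts like it and the anti-experts do not, can be sketched as a decoding-time logit combination. Below is a minimal, hypothetical illustration on a toy vocabulary; the helper names and the toy logit values are assumptions for illustration, with the combined logits formed as the base logits plus a weighted difference between expert and anti-expert logits.

```python
import numpy as np

def dexperts_logits(base_logits, expert_logits, antiexpert_logits, alpha=1.0):
    """Combine next-token logits: tokens favored by the expert and
    disfavored by the anti-expert gain probability mass (alpha scales
    the strength of the steering)."""
    return base_logits + alpha * (expert_logits - antiexpert_logits)

def softmax(z):
    z = z - z.max()  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

# Toy vocabulary of 4 tokens (values are made up for illustration)
base = np.array([1.0, 1.0, 1.0, 1.0])    # base LM is indifferent
expert = np.array([2.0, 0.0, 0.0, 0.0])  # expert prefers token 0
anti = np.array([0.0, 2.0, 0.0, 0.0])    # anti-expert prefers token 1

probs = softmax(dexperts_logits(base, expert, anti, alpha=1.0))
# Token 0 gains probability mass and token 1 loses mass,
# relative to the neutral tokens 2 and 3.
```

In an actual decoder, this combination would be applied to the three models' next-token logits at every generation step before sampling.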


Related research

- 05/15/2022 · Classifiers are Better Experts for Controllable Text Generation: "This paper proposes a simple method for controllable text generation bas..."
- 09/25/2020 · Controllable Text Generation with Focused Variation: "This work introduces Focused-Variation Network (FVN), a novel model to c..."
- 05/16/2020 · A Text Reassembling Approach to Natural Language Generation: "Recent years have seen a number of proposals for performing Natural Lang..."
- 06/05/2020 · CoCon: A Self-Supervised Approach for Controlled Text Generation: "Pretrained Transformer-based language models (LMs) display remarkable na..."
- 02/01/2022 · Typical Decoding for Natural Language Generation: "Despite achieving incredibly low perplexities on myriad natural language..."
- 02/16/2022 · XFBoost: Improving Text Generation with Controllable Decoders: "Multimodal conditionality in transformer-based natural language models h..."
- 03/13/2021 · Improving Diversity of Neural Text Generation via Inverse Probability Weighting: "The neural network based text generation suffers from the text degenerat..."