On-the-Fly Controlled Text Generation with Experts and Anti-Experts

by Alisa Liu et al.

Despite recent advances in natural language generation, it remains challenging to control attributes of generated text. We propose DExperts: Decoding-time Experts, a decoding-time method for controlled text generation which combines a pretrained language model with experts and/or anti-experts in an ensemble of language models. Intuitively, under our ensemble, output tokens only get high probability if they are considered likely by the experts, and unlikely by the anti-experts. We apply DExperts to language detoxification and sentiment-controlled generation, where we outperform existing controllable generation methods on both automatic and human evaluations. Our work highlights the promise of using LMs trained on text with (un)desired attributes for efficient decoding-time controlled language generation.
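The ensemble rule described above can be sketched numerically: the base LM's logits are shifted by the difference between expert and anti-expert logits, so a token ends up with high probability only when the expert favors it and the anti-expert does not. This is a minimal toy illustration, assuming a simple additive combination `base + alpha * (expert - anti_expert)` over raw logits; the function and variable names here are illustrative, not from the paper's released code.

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax over a 1-D logit vector."""
    e = np.exp(z - z.max())
    return e / e.sum()

def ensemble_probs(base_logits, expert_logits, anti_logits, alpha=1.0):
    # Shift the base LM's logits toward the expert and away from the
    # anti-expert; alpha controls the strength of the steering.
    combined = base_logits + alpha * (expert_logits - anti_logits)
    return softmax(combined)

# Toy 4-token vocabulary: the base LM is uniform, the expert favors
# token 0, and the anti-expert favors token 1.
base = np.array([1.0, 1.0, 1.0, 1.0])
expert = np.array([2.0, 0.0, 0.0, 0.0])
anti = np.array([0.0, 2.0, 0.0, 0.0])

p = ensemble_probs(base, expert, anti, alpha=1.0)
# Token 0 (expert-approved) is boosted; token 1 (anti-expert-approved)
# is suppressed below the neutral tokens 2 and 3.
```

With `alpha = 0` the ensemble reduces to the base LM alone, which is one way to see the steering strength as a tunable knob.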




Classifiers are Better Experts for Controllable Text Generation

This paper proposes a simple method for controllable text generation bas...

Controllable Text Generation with Focused Variation

This work introduces Focused-Variation Network (FVN), a novel model to c...

A Text Reassembling Approach to Natural Language Generation

Recent years have seen a number of proposals for performing Natural Lang...

CoCon: A Self-Supervised Approach for Controlled Text Generation

Pretrained Transformer-based language models (LMs) display remarkable na...

Typical Decoding for Natural Language Generation

Despite achieving incredibly low perplexities on myriad natural language...

XFBoost: Improving Text Generation with Controllable Decoders

Multimodal conditionality in transformer-based natural language models h...

Improving Diversity of Neural Text Generation via Inverse Probability Weighting

The neural network based text generation suffers from the text degenerat...