On-the-Fly Controlled Text Generation with Experts and Anti-Experts

by Alisa Liu, et al.

Despite recent advances in natural language generation, it remains challenging to control attributes of generated text. We propose DExperts: Decoding-time Experts, a decoding-time method for controlled text generation that combines a pretrained language model with "expert" and/or "anti-expert" language models in an ensemble. Intuitively, under this ensemble, an output token receives high probability only if it is considered likely by the experts and unlikely by the anti-experts. We apply DExperts to language detoxification and sentiment-controlled generation, where we outperform existing controllable generation methods on both automatic and human evaluations. Our work highlights the promise of using LMs trained on text with (un)desired attributes for efficient decoding-time controlled language generation.
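The intuition in the abstract can be sketched with a small toy example. The following is a minimal NumPy illustration, not the paper's implementation: it assumes the combined next-token distribution is formed by shifting the base LM's logits toward the expert's and away from the anti-expert's (scaled by a hyperparameter `alpha`), then renormalizing; the function and variable names are invented for illustration.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over a logit vector.
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def dexperts_combine(base_logits, expert_logits, antiexpert_logits, alpha=1.0):
    """Toy sketch of a DExperts-style ensemble step: shift the base LM's
    next-token logits toward the expert and away from the anti-expert,
    then renormalize into a probability distribution."""
    combined = base_logits + alpha * (expert_logits - antiexpert_logits)
    return softmax(combined)

# Toy vocabulary of 4 tokens. The expert favors token 2 (a desired
# attribute); the anti-expert favors token 0 (an undesired one).
base = np.array([2.0, 1.0, 1.0, 0.5])
expert = np.array([0.0, 0.0, 2.0, 0.0])
anti = np.array([2.0, 0.0, 0.0, 0.0])

probs = dexperts_combine(base, expert, anti, alpha=1.0)
```

Relative to sampling from `softmax(base)` alone, this shifts probability mass onto the expert-preferred token and away from the anti-expert-preferred one, while tokens the experts agree on are left largely unchanged.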




A Frustratingly Simple Decoding Method for Neural Text Generation

We introduce a frustratingly simple, super efficient and surprisingly ef...

PREADD: Prefix-Adaptive Decoding for Controlled Text Generation

We propose Prefix-Adaptive Decoding (PREADD), a flexible method for cont...

PCFG-based Natural Language Interface Improves Generalization for Controlled Text Generation

Existing work on controlled text generation (CTG) assumes a control inte...

Critic-Guided Decoding for Controlled Text Generation

Steering language generation towards objectives or away from undesired c...

A Text Reassembling Approach to Natural Language Generation

Recent years have seen a number of proposals for performing Natural Lang...

On the Risks of Stealing the Decoding Algorithms of Language Models

A key component of generating text from modern language models (LM) is t...

An Invariant Learning Characterization of Controlled Text Generation

Controlled generation refers to the problem of creating text that contai...
