Mix and Match: Learning-free Controllable Text Generation using Energy Language Models

Recent work on controlled text generation has either required attribute-based fine-tuning of the base language model (LM), or has restricted the parameterization of the attribute discriminator to be compatible with the base autoregressive LM. In this work, we propose Mix and Match LM, a global score-based alternative for controllable text generation that combines arbitrary pre-trained black-box models to achieve the desired attributes in the generated text, without any fine-tuning or structural assumptions about the black-box models. We interpret the task of controllable generation as drawing samples from an energy-based model whose energy values are a linear combination of scores from black-box models that are separately responsible for fluency, the control attribute, and faithfulness to any conditioning context. We use a Metropolis-Hastings sampling scheme to sample from this energy-based model using bidirectional context and global attribute features. We validate the effectiveness of our approach on various controlled generation and style-based text revision tasks, outperforming recently proposed methods that require extra training, fine-tuning, or restrictive assumptions about the form of the models.
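The sampling procedure the abstract describes can be sketched as follows. This is a minimal, hypothetical illustration, not the paper's implementation: the proposal function and the two score functions (`fluency_score`, `attribute_score`) are placeholders for black-box models such as a masked-LM log-probability and an attribute-classifier log-probability; the symmetric-proposal assumption is a simplification.

```python
import math
import random

def combined_energy(seq, fluency_score, attribute_score, alpha=1.0, beta=1.0):
    """Energy as a (negated) linear combination of black-box scores.
    Lower energy = more fluent and more on-attribute. The two score
    functions are assumptions standing in for arbitrary pretrained models."""
    return -(alpha * fluency_score(seq) + beta * attribute_score(seq))

def mh_sample(init_tokens, propose, energy, n_steps=500, seed=0):
    """Metropolis-Hastings over token sequences. `propose(seq, rng)` returns
    a candidate edit of the sequence (e.g. re-filling one masked position);
    the proposal is assumed symmetric, so the acceptance probability
    reduces to min(1, exp(E(current) - E(candidate)))."""
    rng = random.Random(seed)
    seq = list(init_tokens)
    e = energy(seq)
    for _ in range(n_steps):
        cand = propose(seq, rng)
        e_cand = energy(cand)
        if rng.random() < math.exp(min(0.0, e - e_cand)):
            seq, e = cand, e_cand  # accept the proposed edit
    return seq
```

With a real masked LM, `propose` would mask a random position and sample a replacement token, while the energy would mix LM, attribute, and faithfulness scores; the chain then drifts toward fluent, attribute-consistent text.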


Related research

- 06/01/2023 · Focused Prefix Tuning for Controllable Text Generation — "In a controllable text generation dataset, there exist unannotated attri..."
- 11/22/2022 · Linear Interpolation In Parameter Space is Good Enough for Fine-Tuned Language Models — "The simplest way to obtain continuous interpolation between two points i..."
- 05/15/2022 · Mitigating Toxic Degeneration with Empathetic Data: Exploring the Relationship Between Toxicity and Empathy — "Large pre-trained neural language models have supported the effectivenes..."
- 06/20/2023 · Learning to Generate Better Than Your LLM — "Reinforcement learning (RL) has emerged as a powerful paradigm for fine-..."
- 10/18/2022 · DisCup: Discriminator Cooperative Unlikelihood Prompt-tuning for Controllable Text Generation — "Prompt learning with immensely large Casual Language Models (CLMs) has b..."
- 12/16/2022 · DuNST: Dual Noisy Self Training for Semi-Supervised Controllable Text Generation — "Self-training (ST) has prospered again in language understanding by augm..."
- 09/19/2023 · Toward Unified Controllable Text Generation via Regular Expression Instruction — "Controllable text generation is a fundamental aspect of natural language..."
