Controllable Text Generation with Language Constraints

12/20/2022
by Howard Chen, et al.

We consider the task of text generation in language models with constraints specified in natural language. To this end, we first create a challenging benchmark, Cognac, that provides as input to the model a topic with example text, along with a constraint on text to be avoided. Unlike prior work, our benchmark contains knowledge-intensive constraints sourced from databases like WordNet and Wikidata, which allows for straightforward evaluation while striking a balance between broad attribute-level and narrow lexical-level controls. We find that even state-of-the-art language models like GPT-3 often fail on this task, and we propose a solution that leverages a language model's own internal knowledge to guide generation. Our method, called CognacGen, first queries the language model to generate guidance terms for a specified topic or constraint, then uses that guidance to modify the model's token generation probabilities. We propose three forms of guidance (binary verifier, top-k tokens, textual example) and employ prefix-tuning to distill the guidance so the method can tackle diverse natural language constraints. Through extensive empirical evaluation, we demonstrate that CognacGen successfully generalizes to unseen instructions and outperforms competitive baselines in generating constraint-conforming text.
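
The core idea of modifying token probabilities with guidance terms can be illustrated with a short decoding sketch. The snippet below is a minimal, hypothetical illustration, not the paper's implementation: the hard-coded topic and constraint terms stand in for what CognacGen would query from the language model itself, and a simple logit boost/mask stands in for the paper's three guidance forms and prefix-tuned distillation.

```python
# Minimal sketch of guidance-modified decoding, in the spirit of CognacGen.
# Assumptions: guidance terms are hard-coded here for illustration; the
# actual method generates them by querying the language model.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")
model.eval()

# Hypothetical guidance terms: stay on the topic "dogs", avoid "cats".
topic_terms = ["dog", "puppy", "bark"]
constraint_terms = ["cat", "kitten", "meow"]

def first_token_ids(terms):
    # Take the first sub-token of each term (with a leading space, matching
    # GPT-2's tokenization of mid-sentence words). A simplification: the
    # full method would track multi-token guidance terms.
    return {tokenizer.encode(" " + t)[0] for t in terms}

boost_ids = first_token_ids(topic_terms)
block_ids = first_token_ids(constraint_terms)

prompt = "Here is a fact about animals:"
input_ids = tokenizer.encode(prompt, return_tensors="pt")

with torch.no_grad():
    for _ in range(30):
        logits = model(input_ids).logits[0, -1]   # next-token logits
        logits[list(boost_ids)] += 5.0            # encourage topic terms
        logits[list(block_ids)] = -float("inf")   # forbid constraint terms
        next_id = torch.argmax(logits).view(1, 1) # greedy; sampling also works
        input_ids = torch.cat([input_ids, next_id], dim=-1)

print(tokenizer.decode(input_ids[0]))
```

Note that masking only the first sub-token of each constraint term is a shortcut; handling multi-token terms and distilling the guidance into the model (as the paper does via prefix-tuning) requires additional machinery beyond this sketch.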

Related research

11/14/2020 - Conditioned Natural Language Generation using only Unconditioned Language Model: An Exploration
05/27/2022 - Controllable Text Generation with Neurally-Decomposed Oracle
09/19/2023 - Toward Unified Controllable Text Generation via Regular Expression Instruction
06/01/2023 - Preference-grounded Token-level Guidance for Language Model Fine-tuning
02/16/2022 - XFBoost: Improving Text Generation with Controllable Decoders
05/19/2023 - BOLT: Fast Energy-based Controlled Text Generation with Tunable Biases
02/17/2023 - Bounding the Capabilities of Large Language Models in Open Text Generation with Prompt Constraints
