NeuroLogic Decoding: (Un)supervised Neural Text Generation with Predicate Logic Constraints

10/24/2020
by Ximing Lu, et al.

Conditional text generation often requires lexical constraints, i.e., which words should or shouldn't be included in the output text. While the dominant recipe for conditional text generation has been large-scale pretrained language models finetuned on task-specific training data, such models do not learn to follow the underlying constraints reliably, even when supervised with large numbers of task-specific examples. We propose NeuroLogic Decoding, a simple yet effective algorithm that enables neural language models – supervised or not – to generate fluent text while satisfying complex lexical constraints. Our approach is powerful yet efficient: it handles any set of lexical constraints expressible in predicate logic, while its asymptotic runtime is equivalent to that of conventional beam search. Empirical results on four benchmarks show that NeuroLogic Decoding outperforms previous approaches, including algorithms that handle only a subset of our constraints. Moreover, we find that unsupervised models with NeuroLogic Decoding often outperform supervised models with conventional decoding, even when the latter is based on considerably larger networks. Our results suggest the limits of large-scale neural networks for fine-grained controllable generation and the promise of inference-time algorithms.
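The abstract describes rather than reproduces the decoding procedure, so below is a minimal, illustrative Python sketch of the core idea: beam search over a language model in which negative lexical constraints are pruned outright and candidates are grouped by how many positive, CNF-style constraint clauses they satisfy, so that constraint progress is not crowded out of the beam by fluent but unconstrained hypotheses. Everything here (neurologic_beam_search, toy_lm, the fixed penalty of 10.0 per unsatisfied clause) is a hypothetical simplification for exposition, not the authors' implementation, which handles full predicate-logic constraints with more careful constraint-state tracking and pruning.

    import heapq
    import math

    def neurologic_beam_search(next_token_logprobs, start, clauses, banned,
                               beam_size=4, max_len=8, eos="<eos>"):
        """Sketch of lexically constrained beam search (simplified).

        clauses: positive CNF constraints, each a set of words; a clause is
            satisfied once any of its words appears in the hypothesis.
        banned:  words that must never appear (negative literals); candidates
            containing them are pruned outright.
        Finished hypotheses pay a fixed penalty per unsatisfied clause.
        """
        beams = [(0.0, [start])]
        finished = []
        for _ in range(max_len):
            # Expand every live hypothesis by every allowed next token.
            candidates = []
            for score, seq in beams:
                for tok, lp in next_token_logprobs(seq).items():
                    if tok in banned:            # hard-prune negative literals
                        continue
                    candidates.append((score + lp, seq + [tok]))
            if not candidates:
                break
            # Group candidates by number of satisfied clauses, a crude
            # stand-in for NeuroLogic's grouping of constraint states.
            buckets = {}
            for score, seq in candidates:
                k = sum(any(w in seq for w in cl) for cl in clauses)
                buckets.setdefault(k, []).append((score, seq))
            # Fill the beam from the most-satisfying groups first.
            beams = []
            for k in sorted(buckets, reverse=True):
                beams.extend(heapq.nlargest(beam_size - len(beams), buckets[k]))
                if len(beams) >= beam_size:
                    break
            # Retire hypotheses that emitted <eos>, penalizing each
            # clause they left unsatisfied.
            live = []
            for score, seq in beams:
                if seq[-1] == eos:
                    unsat = sum(not any(w in seq for w in cl) for cl in clauses)
                    finished.append((score - 10.0 * unsat, seq))
                else:
                    live.append((score, seq))
            beams = live
            if not beams:
                break
        # Fall back to unfinished hypotheses, with the same penalty.
        for score, seq in beams:
            unsat = sum(not any(w in seq for w in cl) for cl in clauses)
            finished.append((score - 10.0 * unsat, seq))
        return max(finished, key=lambda x: x[0])[1] if finished else None

    # Toy uniform LM over a tiny vocabulary (purely illustrative).
    VOCAB = ["the", "dog", "runs", "fast", "cat", "<eos>"]

    def toy_lm(seq):
        return {t: math.log(1.0 / len(VOCAB)) for t in VOCAB}

    # Constraint: mention "dog" AND ("runs" OR "fast"); never say "cat".
    print(neurologic_beam_search(toy_lm, "<bos>",
                                 clauses=[{"dog"}, {"runs", "fast"}],
                                 banned={"cat"}))

With the uniform toy LM, the clause bucketing alone steers the beam toward outputs containing "dog" and one of "runs"/"fast"; with a real LM, the same grouping trades off fluency (log-probability) against constraint progress, which is the balance the paper's algorithm manages at the asymptotic cost of ordinary beam search.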


Related research

AutoTemplate: A Simple Recipe for Lexically Constrained Text Generation (11/15/2022)
Lexically constrained text generation is one of the constrained text gen...

Extract, Denoise, and Enforce: Evaluating and Predicting Lexical Constraints for Conditional Text Generation (04/18/2021)
Recently, pre-trained language models (PLMs) have dominated conditional ...

Reflective Decoding: Unsupervised Paraphrasing and Abductive Reasoning (10/16/2020)
Pretrained Language Models (LMs) generate text with remarkable quality, ...

On the Risks of Stealing the Decoding Algorithms of Language Models (03/08/2023)
A key component of generating text from modern language models (LM) is t...

ENTRUST: Argument Reframing with Language Models and Entailment (03/11/2021)
"Framing" involves the positive or negative presentation of an argument ...

Toward Unified Controllable Text Generation via Regular Expression Instruction (09/19/2023)
Controllable text generation is a fundamental aspect of natural language...

CTRL: A Conditional Transformer Language Model for Controllable Generation (09/11/2019)
Large-scale language models show promising text generation capabilities,...
