On the probability-quality paradox in language generation

03/31/2022
by Clara Meister, et al.

When generating natural language from neural probabilistic models, high probability does not always coincide with high quality: It has often been observed that mode-seeking decoding methods, i.e., those that produce high-probability text under the model, lead to unnatural language. On the other hand, the lower-probability text generated by stochastic methods is perceived as more human-like. In this note, we offer an explanation for this phenomenon by analyzing language generation through an information-theoretic lens. Specifically, we posit that human-like language should contain an amount of information (quantified as negative log-probability) that is close to the entropy of the distribution over natural strings. Further, we posit that language with substantially more (or less) information is undesirable. We provide preliminary empirical evidence in favor of this hypothesis; quality ratings of both human and machine-generated text – covering multiple tasks and common decoding strategies – suggest high-quality text has an information content significantly closer to the entropy than we would expect by chance.
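To make the abstract's central quantities concrete, here is a minimal sketch (Python/NumPy; the function names information_content and conditional_entropy are illustrative, not taken from the paper) that computes the surprisal -log p(x_t | x_<t) of each generated token alongside the entropy of the model's next-token distribution at that step. Roughly, the hypothesis is that human-like text keeps the gap between these two quantities small on average.

import numpy as np

def information_content(probs, token_ids):
    # Surprisal -log p(x_t | x_<t) of each chosen token, given the model's
    # next-token distribution at every decoding step (one row of `probs` per step).
    return -np.log(probs[np.arange(len(token_ids)), token_ids])

def conditional_entropy(probs):
    # Entropy H(p_t) = -sum_x p_t(x) log p_t(x) of each next-token distribution.
    return -np.sum(probs * np.log(probs + 1e-12), axis=-1)

# Toy example: three decoding steps over a four-token vocabulary.
probs = np.array([
    [0.70, 0.10, 0.10, 0.10],
    [0.40, 0.30, 0.20, 0.10],
    [0.25, 0.25, 0.25, 0.25],
])
generated = np.array([0, 1, 2])  # token chosen at each step

surprisal = information_content(probs, generated)
entropy = conditional_entropy(probs)
print("surprisal per token:", surprisal)
print("entropy per step:   ", entropy)
print("mean absolute gap:  ", np.mean(np.abs(surprisal - entropy)))

A greedy (mode-seeking) decoder always picks the argmax token, pushing the surprisal to or below the per-step entropy, whereas ancestral sampling matches it in expectation; on this reading, text whose information content sits far below the entropy is exactly the high-probability but unnatural output the abstract describes.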

Related research

03/29/2022 · On Decoding Strategies for Neural Text Generators
When generating text from probabilistic models, the chosen decoding stra...

02/01/2022 · Typical Decoding for Natural Language Generation
Despite achieving incredibly low perplexities on myriad natural language...

02/14/2023 · The Stable Entropy Hypothesis and Entropy-Aware Decoding: An Analysis and Algorithm for Robust Natural Language Generation
State-of-the-art language generation models can degenerate when applied ...

07/29/2020 · Mirostat: A Perplexity-Controlled Neural Text Decoding Algorithm
Neural text decoding is important for generating high-quality texts usin...

07/07/2023 · On the Efficacy of Sampling Adapters
Sampling is a common strategy for generating text from probabilistic mod...

05/19/2023 · What Comes Next? Evaluating Uncertainty in Neural Text Generators Against Human Production Variability
In Natural Language Generation (NLG) tasks, for any input, multiple comm...

03/13/2021 · Improving Diversity of Neural Text Generation via Inverse Probability Weighting
The neural network based text generation suffers from the text degenerat...
