On the Efficacy of Sampling Adapters

07/07/2023
by Clara Meister, et al.

Sampling is a common strategy for generating text from probabilistic models, yet standard ancestral sampling often results in text that is incoherent or ungrammatical. To alleviate this issue, various modifications to a model's sampling distribution, such as nucleus or top-k sampling, have been introduced and are now ubiquitously used in language generation systems. We propose a unified framework for understanding these techniques, which we term sampling adapters. Sampling adapters often lead to qualitatively better text, which raises the question: From a formal perspective, how are they changing the (sub)word-level distributions of language generation models? And why do these local changes lead to higher-quality text? We argue that the shift they enforce can be viewed as a trade-off between precision and recall: while the model loses its ability to produce certain strings, its precision rate on desirable text increases. While this trade-off is not reflected in standard metrics of distribution quality (such as perplexity), we find that several precision-emphasizing measures indeed indicate that sampling adapters can lead to probability distributions more aligned with the true distribution. Further, these measures correlate with higher sequence-level quality scores, specifically, Mauve.
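Two widely used sampling adapters mentioned above, top-k and nucleus (top-p) sampling, can be viewed as simple transformations of the model's next-token distribution: each truncates the distribution to a high-probability subset and renormalizes. The NumPy sketch below is purely illustrative; the function names and the toy distribution are our own, not taken from the paper.

```python
import numpy as np

def top_k_adapter(probs, k):
    """Keep only the k highest-probability tokens, then renormalize."""
    out = np.zeros_like(probs)
    top = np.argsort(probs)[-k:]          # indices of the k largest probabilities
    out[top] = probs[top]
    return out / out.sum()

def nucleus_adapter(probs, p):
    """Keep the smallest set of tokens whose cumulative probability reaches p."""
    order = np.argsort(probs)[::-1]       # tokens sorted by descending probability
    cum = np.cumsum(probs[order])
    cutoff = np.searchsorted(cum, p) + 1  # size of the nucleus
    out = np.zeros_like(probs)
    keep = order[:cutoff]
    out[keep] = probs[keep]
    return out / out.sum()

# Toy next-token distribution over a five-word vocabulary.
probs = np.array([0.5, 0.3, 0.1, 0.05, 0.05])
print(top_k_adapter(probs, 2))      # mass concentrated on the two likeliest tokens
print(nucleus_adapter(probs, 0.7))  # smallest set covering 70% of the mass
```

Both adapters zero out the tail of the distribution, which is the precision-recall trade-off the paper formalizes: strings containing truncated tokens become unreachable, while the remaining mass shifts toward tokens the model is most confident about.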


Related research

11/09/2019  How Decoding Strategies Affect the Verifiability of Generated Text
12/10/2021  Sampling from Discrete Energy-Based Models with Quality/Efficiency Trade-offs
03/31/2022  On the probability-quality paradox in language generation
02/01/2023  Training Normalizing Flows with the Precision-Recall Divergence
05/04/2023  Conformal Nucleus Sampling
05/31/2021  On Fast Sampling of Diffusion Probabilistic Models
09/05/2023  Bilevel Scheduled Sampling for Dialogue Generation
