On the Efficacy of Sampling Adapters

by Clara Meister et al.
University of Cambridge
ETH Zurich

Sampling is a common strategy for generating text from probabilistic models, yet standard ancestral sampling often results in text that is incoherent or ungrammatical. To alleviate this issue, various modifications to a model's sampling distribution, such as nucleus or top-k sampling, have been introduced and are now ubiquitously used in language generation systems. We propose a unified framework for understanding these techniques, which we term sampling adapters. Sampling adapters often lead to qualitatively better text, which raises the question: From a formal perspective, how are they changing the (sub)word-level distributions of language generation models? And why do these local changes lead to higher-quality text? We argue that the shift they enforce can be viewed as a trade-off between precision and recall: while the model loses its ability to produce certain strings, its precision rate on desirable text increases. While this trade-off is not reflected in standard metrics of distribution quality (such as perplexity), we find that several precision-emphasizing measures indeed indicate that sampling adapters can lead to probability distributions more aligned with the true distribution. Further, these measures correlate with higher sequence-level quality scores, specifically, Mauve.
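To make the idea concrete, below is a minimal sketch of how two widely used sampling adapters, top-k and nucleus (top-p) sampling, locally reshape a model's next-token distribution: each zeroes out low-probability tokens and renormalizes the rest. The function names and the toy distribution are illustrative, not from the paper.

```python
def top_k_adapter(probs, k):
    """Zero out all but the k most probable tokens, then renormalize."""
    keep = set(sorted(range(len(probs)), key=probs.__getitem__, reverse=True)[:k])
    truncated = [q if i in keep else 0.0 for i, q in enumerate(probs)]
    total = sum(truncated)
    return [q / total for q in truncated]

def nucleus_adapter(probs, p):
    """Keep the smallest set of top tokens whose cumulative probability
    mass reaches p (the 'nucleus'), then renormalize."""
    order = sorted(range(len(probs)), key=probs.__getitem__, reverse=True)
    cumulative, nucleus = 0.0, set()
    for i in order:
        nucleus.add(i)
        cumulative += probs[i]
        if cumulative >= p:
            break
    truncated = [q if i in nucleus else 0.0 for i, q in enumerate(probs)]
    total = sum(truncated)
    return [q / total for q in truncated]

# A toy next-token distribution over five tokens:
dist = [0.5, 0.25, 0.15, 0.07, 0.03]
print(top_k_adapter(dist, k=2))      # all mass moved onto the top 2 tokens
print(nucleus_adapter(dist, p=0.9))  # keeps tokens covering 90% of the mass
```

Both adapters trade recall for precision in exactly the sense the abstract describes: strings containing a pruned token become unsamplable, while the surviving tokens' probabilities are scaled up.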


page 7

page 13

page 14

page 15

page 16


