What Comes Next? Evaluating Uncertainty in Neural Text Generators Against Human Production Variability

05/19/2023
by Mario Giulianelli, et al.

In Natural Language Generation (NLG) tasks, for any input, multiple communicative goals are plausible, and any goal can be put into words, or produced, in multiple ways. We characterise the extent to which human production varies lexically, syntactically, and semantically across four NLG tasks, connecting human production variability to aleatoric or data uncertainty. We then inspect the space of output strings shaped by a generation system's predicted probability distribution and decoding algorithm to probe its uncertainty. For each test input, we measure the generator's calibration to human production variability. With this instance-level approach, we analyse NLG models and decoding strategies, demonstrating that probing a generator with multiple samples and, when possible, multiple references provides the level of detail necessary to understand a model's representation of uncertainty.
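The instance-level probe described in the abstract compares variability among multiple human references with variability among multiple model samples for the same input. The sketch below illustrates the general idea under simplifying assumptions: it uses a token-level normalised edit distance as a purely lexical proxy (the paper measures lexical, syntactic, and semantic variability with its own metrics), and the reference and sample strings are hypothetical.

```python
import itertools
import statistics

def levenshtein(a, b):
    """Token-level edit distance between two token sequences."""
    m, n = len(a), len(b)
    prev = list(range(n + 1))
    for i in range(1, m + 1):
        curr = [i] + [0] * n
        for j in range(1, n + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            curr[j] = min(prev[j] + 1, curr[j - 1] + 1, prev[j - 1] + cost)
        prev = curr
    return prev[n]

def mean_pairwise_distance(strings):
    """Mean normalised edit distance over all unordered pairs: a rough
    proxy for lexical production variability within a set of outputs."""
    dists = []
    for s, t in itertools.combinations(strings, 2):
        a, b = s.split(), t.split()
        denom = max(len(a), len(b)) or 1
        dists.append(levenshtein(a, b) / denom)
    return statistics.mean(dists) if dists else 0.0

# Hypothetical data: several human references and several model samples
# for a single test input.
references = [
    "the cat sat on the mat",
    "a cat is sitting on a mat",
    "there is a cat on the mat",
]
samples = [
    "the cat sat on the mat",
    "the cat sat on the mat",
    "the cat is on the mat",
]

# A generator whose sample variability tracks the variability of the
# human references is, on this crude lexical measure, better calibrated.
human_var = mean_pairwise_distance(references)
model_var = mean_pairwise_distance(samples)
print(f"human variability: {human_var:.3f}, model variability: {model_var:.3f}")
```

Summarising each set by its mean pairwise distance is only one simple choice; the paper's analysis compares the full distributions of distances per instance, which preserves more detail about how a generator's output space relates to human variation.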

Related research

10/23/2022
Towards Pragmatic Production Strategies for Natural Language Generation Tasks
This position paper proposes a conceptual framework for the design of Na...

09/20/2021
A Plug-and-Play Method for Controlled Text Generation
Large pre-trained language models have repeatedly shown their ability to...

05/28/2022
Teaching Models to Express Their Uncertainty in Words
We show that a GPT-3 model can learn to express uncertainty about its ow...

03/31/2022
On the Probability-Quality Paradox in Language Generation
When generating natural language from neural probabilistic models, high ...

09/16/2021
The Language Model Understood the Prompt was Ambiguous: Probing Syntactic Uncertainty Through Generation
Temporary syntactic ambiguities arise when the beginning of a sentence i...

04/01/2022
Uncertainty Determines the Adequacy of the Mode and the Tractability of Decoding in Sequence-to-Sequence Models
In many natural language processing (NLP) tasks the same input (e.g. sou...

07/28/2023
Uncertainty in Natural Language Generation: From Theory to Applications
Recent advances of powerful Language Models have allowed Natural Languag...
