On Hallucination and Predictive Uncertainty in Conditional Language Generation

03/28/2021
by Yijun Xiao, et al.

Despite improvements in performance on various natural language generation tasks, deep neural models are prone to hallucinating facts that are incorrect or nonexistent. Different hypotheses have been proposed and examined separately for different tasks, but no systematic explanation is available across tasks. In this study, we draw connections between hallucinations and predictive uncertainty in conditional language generation. We investigate their relationship in both image captioning and data-to-text generation and propose a simple extension to beam search to reduce hallucination. Our analysis shows that higher predictive uncertainty corresponds to a higher chance of hallucination. Epistemic uncertainty is more indicative of hallucination than aleatoric or total uncertainty, and with the proposed beam search variant it achieves a better trade-off between performance on standard metrics and reduced hallucination.
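As a rough illustration of the quantities discussed in the abstract, the sketch below decomposes next-token predictive uncertainty into aleatoric and epistemic parts from Monte Carlo samples of the output distribution (e.g., stochastic forward passes with dropout), and applies a hypothetical uncertainty penalty to a beam score. The function names, the penalty weight, and the exact form of the penalty are illustrative assumptions rather than the paper's exact formulation.

```python
import numpy as np

def uncertainty_decomposition(prob_samples):
    """Decompose predictive uncertainty from Monte Carlo samples of a
    next-token distribution (shape: n_samples x vocab_size).

    Returns (total, aleatoric, epistemic), where
      total     = entropy of the mean distribution,
      aleatoric = mean entropy of the sampled distributions,
      epistemic = total - aleatoric (the mutual-information term).
    """
    eps = 1e-12
    mean_probs = prob_samples.mean(axis=0)
    total = -np.sum(mean_probs * np.log(mean_probs + eps))
    aleatoric = -np.mean(np.sum(prob_samples * np.log(prob_samples + eps), axis=1))
    return total, aleatoric, total - aleatoric

def penalized_beam_score(log_prob_sum, uncertainty_sum, weight=0.5):
    """Hypothetical uncertainty-penalized beam score: cumulative
    log-probability of a partial hypothesis minus a weighted sum of
    per-step uncertainties. A larger weight trades standard-metric
    performance for less hallucination."""
    return log_prob_sum - weight * uncertainty_sum

# Toy example: three stochastic forward passes over a 4-token vocabulary.
samples = np.array([
    [0.70, 0.10, 0.10, 0.10],
    [0.10, 0.70, 0.10, 0.10],
    [0.10, 0.10, 0.70, 0.10],
])
total, aleatoric, epistemic = uncertainty_decomposition(samples)
score = penalized_beam_score(log_prob_sum=-3.2, uncertainty_sum=epistemic)
```

Ranking candidate expansions by such a penalized score instead of log-probability alone is one way to realize the trade-off described in the abstract.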
