Improving Diversity of Neural Text Generation via Inverse Probability Weighting

03/13/2021
by Xinran Zhang, et al.

Neural text generation suffers from degeneration issues such as repetition. Although top-k sampling and nucleus sampling outperform beam-search-based decoding methods, they only truncate the "tail" of the distribution and do not address the "head", which we show may contain tedious or even repetitive high-probability candidates that lead to repetition loops. They also do not fully address the fact that human text does not always favor high-probability words. To improve the diversity of generated text, we propose a heuristic sampling method inspired by inverse probability weighting. We use the interquartile range of the predicted distribution to identify the "head", then permute and rescale the head with inverse probabilities. This decreases the probability of tedious and possibly repetitive high-probability candidates while increasing the probability of rational but more surprising low-probability candidates. The proposed algorithm provides a controllable variation on the predicted distribution that enhances diversity without compromising its rationality. We compare our algorithm with nucleus sampling using a pre-trained language model. Results show that our algorithm effectively increases the diversity of generated samples while achieving close resemblance to human text.
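For illustration, here is a minimal Python sketch of the idea described above. The specific head criterion (probability above Q3 + 1.5·IQR) and the mass-preserving inverse rescaling are assumptions made for this example; the paper's exact head-detection rule, permutation, and rescaling may differ, and in practice this step would be combined with tail truncation such as nucleus sampling.

```python
import numpy as np

def inverse_probability_head_sampling(probs, rng=None):
    """Heuristic sketch: reweight the "head" of a next-token distribution
    with inverse probabilities before sampling.

    probs: 1-D array of next-token probabilities (sums to 1).
    Tokens whose probability exceeds Q3 + 1.5 * IQR are treated as the
    "head" (an assumed criterion). Head probabilities are replaced by
    weights proportional to their inverse and rescaled so that the
    head's total probability mass is unchanged.
    """
    rng = rng or np.random.default_rng()
    probs = np.asarray(probs, dtype=np.float64)

    # Identify high-probability outliers via the interquartile range.
    q1, q3 = np.percentile(probs, [25, 75])
    head = probs > q3 + 1.5 * (q3 - q1)

    adjusted = probs.copy()
    if head.sum() > 1:  # nothing to reweight if the head has 0 or 1 tokens
        head_mass = probs[head].sum()
        inv = 1.0 / probs[head]                       # inverse-probability weights
        adjusted[head] = inv / inv.sum() * head_mass  # preserve total head mass

    adjusted /= adjusted.sum()  # guard against rounding drift
    return rng.choice(len(adjusted), p=adjusted)
```

Under this sketch, the most probable token in the head receives the smallest reweighted probability and vice versa, which boosts rational but more surprising candidates without moving any probability mass into the tail.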


