Look-back Decoding for Open-Ended Text Generation

05/22/2023
by   Nan Xu, et al.

Given a prefix (context), open-ended generation aims to decode text that is coherent (it does not abruptly drift from previous topics) and informative (it does not suffer from undesired repetitions). In this paper, we propose Look-back, an improved decoding algorithm that leverages the Kullback-Leibler divergence to track the distributional distance between the current and historical decoding steps. Look-back can thus automatically predict potential repetitive phrases and topic drift, and remove tokens that may cause these failure modes, restricting the next-token probability distribution to within a plausible distance of the history. We perform decoding experiments on document continuation and story generation, and demonstrate that Look-back generates more fluent and coherent text, significantly outperforming other strong decoding methods in both automatic and human evaluations.
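The core signal the abstract describes can be illustrated with a small sketch: compare the model's current next-token distribution against the distributions from earlier decoding steps via KL divergence, and flag a step whose distribution is suspiciously close to a historical one as a potential repetition. This is a minimal illustration of the idea, not the authors' implementation; the threshold and the min-over-history rule here are assumptions for demonstration.

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """KL(p || q) between two next-token probability distributions.
    A small epsilon avoids log(0) on zero-probability tokens."""
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p /= p.sum()
    q /= q.sum()
    return float(np.sum(p * np.log(p / q)))

def min_history_divergence(current_dist, history_dists):
    """Smallest KL divergence between the current step's distribution and
    any earlier step's; a small value suggests the model may be about to
    repeat itself (a hypothetical repetition signal, per the paper's idea)."""
    return min(kl_divergence(current_dist, h) for h in history_dists)

# Toy vocabulary of size 4: the current distribution closely matches the
# first historical step, signalling potential repetition.
history = [[0.7, 0.1, 0.1, 0.1], [0.1, 0.1, 0.1, 0.7]]
current = [0.68, 0.12, 0.1, 0.1]
score = min_history_divergence(current, history)
REPEAT_THRESHOLD = 0.05  # assumed value, for illustration only
is_potential_repeat = score < REPEAT_THRESHOLD
```

In the full algorithm, a step flagged this way would trigger a constrained re-selection of the next token rather than a hard stop; the snippet only shows the detection side.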

Related research

MAUVE: Human-Machine Divergence Curves for Evaluating Open-Ended Text Generation (02/02/2021)
Despite major advances in open-ended text generation, there has been lim...

A Plug-and-Play Method for Controlled Text Generation (09/20/2021)
Large pre-trained language models have repeatedly shown their ability to...

Exclusive Hierarchical Decoding for Deep Keyphrase Generation (04/18/2020)
Keyphrase generation (KG) aims to summarize the main ideas of a document...

MOCHA: A Multi-Task Training Approach for Coherent Text Generation from Cognitive Perspective (10/26/2022)
Teaching neural models to generate narrative coherent texts is a critica...

A Theme-Rewriting Approach for Generating Algebra Word Problems (10/19/2016)
Texts present coherent stories that have a particular theme or overall s...

Contrastive Decoding: Open-ended Text Generation as Optimization (10/27/2022)
Likelihood, although useful as a training loss, is a poor search objecti...

An Empirical Study On Contrastive Search And Contrastive Decoding For Open-ended Text Generation (11/19/2022)
In the study, we empirically compare the two recently proposed decoding ...
