Nearest Neighbor Non-autoregressive Text Generation

08/26/2022
by Ayana Niwa, et al.

Non-autoregressive (NAR) models can generate sentences with less computation than autoregressive models, but at the cost of generation quality. Previous studies addressed this issue through iterative decoding. This study proposes using nearest neighbors as the initial state of an NAR decoder and editing them iteratively. We present a novel training strategy that learns edit operations on neighbors to improve NAR text generation. Experimental results show that the proposed method (NeighborEdit) achieves higher translation quality (1.69 points above the vanilla Transformer) with far fewer decoding iterations (one-eighteenth as many) on the JRC-Acquis En-De dataset, a common benchmark for machine translation with nearest neighbors. We also confirm the effectiveness of the proposed method on a data-to-text task (WikiBio). In addition, the proposed method outperforms an NAR baseline on the WMT'14 En-De dataset. Finally, we report an analysis of the neighbor examples used in the proposed method.
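The retrieve-then-edit idea in the abstract can be sketched as a small decoding loop: retrieve the training example whose source is closest to the input, initialize the decoder output with that example's target, and repeatedly apply a learned edit model until the hypothesis stops changing. The sketch below is illustrative only; the Jaccard retrieval metric, the `edit_model` interface, and all function names are assumptions for the sake of the example, not the paper's actual implementation (which uses learned edit operations inside an NAR Transformer).

```python
def jaccard(a, b):
    """Token-overlap similarity between two token sequences."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b)

def retrieve_neighbor(src_tokens, datastore):
    """Return the target side of the datastore example whose source
    is most similar to src_tokens. datastore is a list of
    (source_tokens, target_tokens) pairs."""
    return max(datastore, key=lambda ex: jaccard(src_tokens, ex[0]))[1]

def neighbor_edit_decode(src_tokens, datastore, edit_model, max_iters=10):
    """Initialize the output with the retrieved neighbor's target,
    then refine it with predicted edits until a fixed point (or the
    iteration budget) is reached."""
    hyp = retrieve_neighbor(src_tokens, datastore)
    for _ in range(max_iters):
        new_hyp = edit_model(src_tokens, hyp)
        if new_hyp == hyp:  # converged: no further edits predicted
            break
        hyp = new_hyp
    return hyp
```

With a trained `edit_model`, the loop typically converges in far fewer iterations than generic iterative refinement, because the neighbor already supplies most of the target; this matches the abstract's claim of needing only a fraction of the usual decoding iterations.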

Related research:
- Non-autoregressive Transformer by Position Learning (11/25/2019)
- Non-Autoregressive Text Generation with Pre-trained Language Models (02/16/2021)
- Generating (Formulaic) Text by Splicing Together Nearest Neighbors (01/20/2021)
- Cascaded Text Generation with Markov Transformers (06/01/2020)
- Sketch and Refine: Towards Faithful and Informative Table-to-Text Generation (05/31/2021)
- An Adversarial Non-Autoregressive Model for Text Generation with Incomplete Information (05/06/2023)
- Step-unrolled Denoising Autoencoders for Text Generation (12/13/2021)
