A Survey on Non-Autoregressive Generation for Neural Machine Translation and Beyond

04/20/2022
by Yisheng Xiao, et al.

Non-autoregressive (NAR) generation, first proposed in neural machine translation (NMT) to speed up inference, has attracted much attention in both the machine learning and natural language processing communities. While NAR generation can significantly accelerate inference for machine translation, the speedup comes at the cost of translation accuracy compared with its autoregressive (AR) counterpart. In recent years, many new models and algorithms have been proposed to bridge the accuracy gap between NAR and AR generation. In this paper, we conduct a systematic survey, with comparisons and discussions, of various non-autoregressive translation (NAT) models from different aspects. Specifically, we categorize NAT efforts into several groups: data manipulation, modeling methods, training criteria, decoding algorithms, and benefits from pre-trained models. Furthermore, we briefly review other applications of NAR models beyond machine translation, such as dialogue generation, text summarization, grammatical error correction, semantic parsing, speech synthesis, and automatic speech recognition. In addition, we discuss potential directions for future exploration, including relaxing the dependency on knowledge distillation (KD), dynamic length prediction, pre-training for NAR, and wider applications. We hope this survey helps researchers capture the latest progress in NAR generation, inspires the design of advanced NAR models and algorithms, and enables industry practitioners to choose appropriate solutions for their applications. The web page of this survey is at <https://github.com/LitterBrother-Xiao/Overview-of-Non-autoregressive-Applications>.
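To make the speed/accuracy trade-off concrete: an AR model factorizes p(y|x) = prod_t p(y_t | y_{<t}, x), so decoding requires one model call per generated token, whereas a NAR model assumes conditional independence across positions, p(y|x) = prod_t p(y_t | x), and emits all tokens in a single parallel call. The sketch below illustrates only this difference in decoding loops; `ar_step` and `nar_predict` are hypothetical toy stand-ins for a trained model, not the API of any surveyed system.

```python
# Toy contrast between autoregressive (AR) and non-autoregressive (NAR)
# decoding. `ar_step` and `nar_predict` are hypothetical stand-ins for a
# trained model; only the shape of the two decoding loops matters here.

def ar_step(source, prefix):
    """AR stand-in: predict the next token given the source and the prefix."""
    canned = ["hello", "world", "<eos>"]
    return canned[min(len(prefix), len(canned) - 1)]

def nar_predict(source, length):
    """NAR stand-in: predict every target position in one parallel call."""
    canned = ["hello", "world"]
    return [canned[i] if i < len(canned) else "<eos>" for i in range(length)]

def ar_decode(source, max_len=10):
    """Sequential decoding: one model call per generated token."""
    prefix = []
    for _ in range(max_len):
        token = ar_step(source, prefix)
        if token == "<eos>":
            break
        prefix.append(token)
    return prefix

def nar_decode(source):
    """Parallel decoding: a single model call covers all positions.

    The target length is normally produced by a separate length predictor;
    here it is simply hard-coded for the toy example.
    """
    length = 2
    return [t for t in nar_predict(source, length) if t != "<eos>"]

if __name__ == "__main__":
    src = ["hallo", "welt"]
    print("AR :", ar_decode(src))   # ['hello', 'world'] after 3 sequential calls
    print("NAR:", nar_decode(src))  # ['hello', 'world'] after 1 parallel call
```

The single parallel call is where the speedup comes from, and the per-position independence assumption is where the accuracy gap originates; the techniques surveyed here (KD, iterative decoding, improved training criteria, and so on) aim to close that gap.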

Related research

Imitation Learning for Non-Autoregressive Neural Machine Translation (06/05/2019)
A Study of Non-autoregressive Model for Sequence Generation (04/22/2020)
Directed Acyclic Transformer Pre-training for High-quality Non-autoregressive Text Generation (04/24/2023)
AMOM: Adaptive Masking over Masking for Conditional Masked Language Model (03/13/2023)
A Comparative Study on Non-Autoregressive Modelings for Speech-to-Text Generation (10/11/2021)
Recent Trends in the Use of Deep Learning Models for Grammar Error Handling (09/04/2020)
A Survey on Neural Speech Synthesis (06/29/2021)
