Retrieving Sequential Information for Non-Autoregressive Neural Machine Translation

06/22/2019
by Chenze Shao, et al.

The Non-Autoregressive Transformer (NAT) accelerates the Transformer by discarding the autoregressive mechanism and generating target words independently, which prevents it from exploiting target-side sequential information. As a result, over-translation and under-translation errors occur frequently, especially on long sentences. In this paper, we propose two approaches that retrieve the target sequential information for NAT to enhance its translation quality while preserving fast decoding. First, we propose a sequence-level training method based on a novel reinforcement algorithm for NAT (Reinforce-NAT) that reduces variance and stabilizes training. Second, we propose a novel Transformer decoder, the FS-decoder, which fuses target sequential information into the top layer of the decoder. Experimental results on three translation tasks show that Reinforce-NAT surpasses the baseline NAT system by a significant BLEU margin without slowing decoding, and that the FS-decoder achieves translation performance comparable to the autoregressive Transformer with a considerable speedup.
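As an illustration of the sequence-level training idea, the following is a minimal PyTorch sketch of a REINFORCE-style update for a NAT model whose decoder emits all target positions in parallel. It is not the paper's Reinforce-NAT implementation: the `model` interface, the `sentence_bleu` reward function, and the scalar `baseline` are hypothetical placeholders standing in for the paper's design.

```python
# A minimal sketch (not the authors' code) of sequence-level REINFORCE
# training for a NAT model. `model`, `sentence_bleu`, and `baseline`
# are hypothetical placeholders.
import torch
import torch.nn.functional as F

def reinforce_nat_step(model, src, tgt, sentence_bleu, baseline=0.0):
    """One REINFORCE update: sample a translation from the NAT model's
    independent per-position distributions, score it against the
    reference with a sequence-level reward (e.g. sentence BLEU), and
    weight the sample's log-likelihood gradient by the
    baseline-subtracted reward."""
    logits = model(src)                    # (batch, tgt_len, vocab); positions predicted in parallel
    probs = F.softmax(logits, dim=-1)
    # Sample one token per position from the independent distributions.
    sample = torch.multinomial(probs.view(-1, probs.size(-1)), 1)
    sample = sample.view(probs.size(0), probs.size(1))          # (batch, tgt_len)
    log_p = F.log_softmax(logits, dim=-1)
    log_p = log_p.gather(-1, sample.unsqueeze(-1)).squeeze(-1)  # (batch, tgt_len)
    # Sequence-level reward per sentence; subtracting a baseline
    # reduces the variance of the gradient estimate.
    reward = torch.tensor([sentence_bleu(s.tolist(), t.tolist())
                           for s, t in zip(sample, tgt)], device=logits.device)
    loss = -((reward - baseline) * log_p.sum(dim=-1)).mean()
    loss.backward()
    return loss.item(), reward.mean().item()
```

Baseline subtraction is the standard variance-reduction device for REINFORCE estimators; the paper's contribution lies in the specific algorithm that makes such sequence-level training stable for NAT, which this sketch does not reproduce.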


