Modeling Coverage for Non-Autoregressive Neural Machine Translation

04/24/2021
by Yong Shan, et al.

Non-Autoregressive Neural Machine Translation (NAT) achieves significant inference speedup by generating all tokens simultaneously. Despite its high efficiency, NAT typically suffers from two kinds of translation errors: over-translation (e.g., repeated tokens) and under-translation (e.g., missing translations), which limit translation quality. In this paper, we argue that these issues can be addressed through coverage modeling, which has proven useful in autoregressive decoding. We propose a novel Coverage-NAT that models coverage information directly, via a token-level coverage iterative refinement mechanism and a sentence-level coverage agreement: the former reminds the model whether a source token has been translated, while the latter improves semantic consistency between the translation and the source. Experimental results on the WMT14 En-De and WMT16 En-Ro translation tasks show that our method alleviates these errors and achieves strong improvements over the baseline system.
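To make the coverage idea concrete, here is a minimal illustrative sketch (not the paper's actual implementation): token-level coverage is commonly computed as the attention mass each source token has accumulated across all generated target tokens, so values near zero flag possible under-translation and values well above one flag possible over-translation. The function names, the toy attention matrix, and the cosine-based agreement score are all assumptions made for illustration.

```python
def token_coverage(attention_rows):
    # attention_rows: one attention distribution over source tokens per
    # generated target token. The coverage of source token j is the total
    # attention it has received over the whole translation.
    n_src = len(attention_rows[0])
    coverage = [0.0] * n_src
    for row in attention_rows:
        for j, weight in enumerate(row):
            coverage[j] += weight
    return coverage

def agreement(src_vec, tgt_vec):
    # Cosine similarity between pooled source and translation vectors,
    # a hypothetical stand-in for a sentence-level coverage agreement score.
    dot = sum(s * t for s, t in zip(src_vec, tgt_vec))
    norm_s = sum(s * s for s in src_vec) ** 0.5
    norm_t = sum(t * t for t in tgt_vec) ** 0.5
    return dot / (norm_s * norm_t) if norm_s and norm_t else 0.0

# Toy example: 3 target tokens attending over 3 source tokens.
attn = [
    [0.7, 0.2, 0.1],  # target token 1 focuses on source token 0
    [0.6, 0.3, 0.1],  # target token 2 also focuses on source token 0
    [0.1, 0.1, 0.8],  # target token 3 focuses on source token 2
]
cov = token_coverage(attn)
# cov[0] = 1.4 (possible over-translation, e.g. a repeated token),
# cov[1] = 0.6 (possible under-translation, e.g. a missing word).
```

A refinement step could then use such a coverage vector to down-weight already-covered source tokens, while the agreement score could serve as a training signal penalizing translations whose pooled representation drifts from the source.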

research
06/09/2020

Learning to Recover from Multi-Modality Errors for Non-Autoregressive Neural Machine Translation

Non-autoregressive neural machine translation (NAT) predicts the entire ...
research
11/06/2019

Guiding Non-Autoregressive Neural Machine Translation Decoding with Reordering Information

Non-autoregressive neural machine translation (NAT) generates each targe...
research
02/22/2019

Non-Autoregressive Machine Translation with Auxiliary Regularization

As a new neural machine translation approach, Non-Autoregressive machine...
research
11/21/2019

Minimizing the Bag-of-Ngrams Difference for Non-Autoregressive Neural Machine Translation

Non-Autoregressive Neural Machine Translation (NAT) achieves significant...
research
05/26/2021

Bilingual Mutual Information Based Adaptive Training for Neural Machine Translation

Recently, token-level adaptive training has achieved promising improveme...
research
06/10/2022

A Novel Chinese Dialect TTS Frontend with Non-Autoregressive Neural Machine Translation

Chinese dialect text-to-speech(TTS) system usually can only be utilized ...
research
06/15/2021

Sequence-Level Training for Non-Autoregressive Neural Machine Translation

In recent years, Neural Machine Translation (NMT) has achieved notable r...
