Non-Autoregressive Machine Translation with Translation Memories

10/12/2022
by Jitao Xu, et al.

Non-autoregressive machine translation (NAT) has recently made great progress. However, most works to date have focused on standard translation tasks, even though some edit-based NAT models, such as the Levenshtein Transformer (LevT), seem well suited to translate with a Translation Memory (TM). This is the scenario considered here. We first analyze the vanilla LevT model and explain why it does not do well in this setting. We then propose a new variant, TM-LevT, and show how to effectively train this model. By modifying the data presentation and introducing an extra deletion operation, we obtain performance that is on par with an autoregressive approach, while reducing the decoding load. We also show that incorporating TMs during training dispenses with knowledge distillation, a well-known trick used to mitigate the multimodality issue.
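To make the edit-based setting concrete, the following is a minimal toy sketch of LevT-style iterative refinement, initialized from a TM match instead of an empty sequence. The deletion and insertion "policies" here are hypothetical stand-in rules (in the actual model they are learned classifiers), and the function names are illustrative, not from the paper's code.

```python
# Toy sketch of edit-based (LevT-style) decoding, starting from a
# Translation Memory (TM) match. The policies below are placeholder
# rules standing in for the model's trained deletion/insertion heads.

def delete_pass(tokens, should_delete):
    """Drop every token the deletion policy flags."""
    return [t for t in tokens if not should_delete(t)]

def insert_pass(tokens, placeholders_after, fill):
    """Open placeholder slots after each position, then fill them."""
    out = []
    for i, t in enumerate(tokens):
        out.append(t)
        for _ in range(placeholders_after(i, tokens)):
            out.append(fill(i, tokens))
    return out

def edit_decode(tm_target, should_delete, placeholders_after, fill,
                max_iters=3):
    """Refine a TM match with alternating delete/insert passes
    until the hypothesis stops changing."""
    hyp = list(tm_target)
    for _ in range(max_iters):
        new = insert_pass(delete_pass(hyp, should_delete),
                          placeholders_after, fill)
        if new == hyp:  # converged
            break
        hyp = new
    return hyp

# Hypothetical example: the TM proposes "the cat sat", the policies
# delete "cat" and insert "dog" after "the" (only once).
result = edit_decode(
    ["the", "cat", "sat"],
    should_delete=lambda t: t == "cat",
    placeholders_after=lambda i, toks: 1 if toks[i] == "the"
                                            and "dog" not in toks else 0,
    fill=lambda i, toks: "dog",
)
print(result)  # ['the', 'dog', 'sat']
```

The point of the sketch is the initialization: because decoding starts from a full TM sentence rather than scratch, deletion (including the extra deletion operation TM-LevT adds) does most of the work, and only a few refinement passes are needed.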


Related research

- Using Perturbed Length-aware Positional Encoding for Non-autoregressive Neural Machine Translation (07/29/2021)
- How Does Distilled Data Complexity Impact the Quality and Confidence of Non-Autoregressive Machine Translation? (05/27/2021)
- Enriching Non-Autoregressive Transformer with Syntactic and Semantic Structures for Neural Machine Translation (01/22/2021)
- Non-Autoregressive Translation with Layer-Wise Prediction and Deep Supervision (10/14/2021)
- Inference Strategies for Machine Translation with Conditional Masking (10/05/2020)
- Infusing Sequential Information into Conditional Masked Translation Model with Self-Review Mechanism (10/19/2020)
- CUNI Non-Autoregressive System for the WMT 22 Efficient Translation Shared Task (12/01/2022)
