Rephrasing the Reference for Non-Autoregressive Machine Translation

11/30/2022
by Chenze Shao, et al.

Non-autoregressive neural machine translation (NAT) models suffer from the multi-modality problem: a source sentence may have multiple valid translations, so the reference sentence can be an inappropriate training target when the NAT output is closer to another translation. In response to this problem, we introduce a rephraser that provides a better training target for NAT by rephrasing the reference sentence according to the NAT output. Since NAT is trained on the rephraser output rather than the reference sentence, the rephraser output should fit the NAT output well while not deviating too far from the reference; both requirements can be quantified as reward functions and optimized with reinforcement learning. Experiments on major WMT benchmarks and NAT baselines show that our approach consistently improves the translation quality of NAT. Specifically, our best variant achieves comparable performance to the autoregressive Transformer, while being 14.7 times more efficient in inference.
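The core training signal described in the abstract is a scalar reward that balances two terms: how well the rephrased reference fits the NAT output, and how faithful it stays to the original reference. The sketch below is a minimal illustration of that idea, not the authors' implementation: it substitutes a simple unigram-F1 overlap for the paper's actual reward functions, and the names unigram_f1, rephraser_reward, and the trade-off weight alpha are hypothetical.

```python
# Minimal sketch (assumptions, not the paper's code) of the rephraser reward:
# the rephraser output should (i) fit the NAT output and (ii) stay close to
# the reference. Both terms are approximated here with unigram F1 overlap.

from collections import Counter


def unigram_f1(hyp: list[str], ref: list[str]) -> float:
    """Unigram F1 overlap between two token sequences (stand-in for a real metric)."""
    if not hyp or not ref:
        return 0.0
    overlap = sum((Counter(hyp) & Counter(ref)).values())
    if overlap == 0:
        return 0.0
    precision = overlap / len(hyp)
    recall = overlap / len(ref)
    return 2 * precision * recall / (precision + recall)


def rephraser_reward(rephrased, nat_output, reference, alpha=0.5):
    """Reward = alpha * fit-to-NAT-output + (1 - alpha) * faithfulness-to-reference.

    alpha is a hypothetical trade-off weight. In REINFORCE-style training the
    rephraser parameters would be updated with a gradient proportional to
    (reward - baseline) * grad log p(rephrased | reference, nat_output).
    """
    fit = unigram_f1(rephrased, nat_output)
    faithfulness = unigram_f1(rephrased, reference)
    return alpha * fit + (1 - alpha) * faithfulness


if __name__ == "__main__":
    reference = "the cat sat on the mat".split()
    nat_output = "a cat is sitting on the mat".split()
    rephrased = "a cat sat on the mat".split()
    print(rephraser_reward(rephrased, nat_output, reference))
```

A rephrased sentence that blends wording from the NAT output with the content of the reference scores higher than either extreme, which is the trade-off the abstract's two reward terms are meant to capture.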


Related research

05/28/2022 - One Reference Is Not Enough: Diverse Distillation with Reference Selection for Non-Autoregressive Translation
  Non-autoregressive neural machine translation (NAT) suffers from the mul...

03/12/2023 - Fuzzy Alignments in Directed Acyclic Graph for Non-Autoregressive Machine Translation
  Non-autoregressive translation (NAT) reduces the decoding latency but su...

04/04/2019 - Differentiable Sampling with Flexible Reference Word Order for Neural Machine Translation
  Despite some empirical success at correcting exposure bias in machine tr...

05/28/2021 - Reinforcement Learning for on-line Sequence Transformation
  A number of problems in the processing of sound and natural language, as...

05/22/2023 - Non-Autoregressive Document-Level Machine Translation (NA-DMT): Exploring Effective Approaches, Challenges, and Opportunities
  Non-autoregressive translation (NAT) models have been extensively invest...

02/22/2019 - Non-Autoregressive Machine Translation with Auxiliary Regularization
  As a new neural machine translation approach, Non-Autoregressive machine...

12/16/2022 - Detecting and Mitigating Hallucinations in Machine Translation: Model Internal Workings Alone Do Well, Sentence Similarity Even Better
  While the problem of hallucinations in neural machine translation has lo...