Dual Reconstruction: a Unifying Objective for Semi-Supervised Neural Machine Translation

10/07/2020
by Weijia Xu, et al.

While Iterative Back-Translation and Dual Learning effectively incorporate monolingual training data into neural machine translation, they use different objectives and heuristic gradient approximation strategies and have not been extensively compared. We introduce a novel dual reconstruction objective that provides a unified view of Iterative Back-Translation and Dual Learning. It motivates a theoretical analysis and a controlled empirical study on German-English and Turkish-English tasks, both of which suggest that Iterative Back-Translation is more effective than Dual Learning, despite its relative simplicity.
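For context, the round-trip reconstruction idea behind both methods can be written as a single objective. The following is a sketch of the general form, assuming monolingual corpora $\mathcal{D}_x$ and $\mathcal{D}_y$ and translation models $p(y \mid x; \theta_{xy})$ and $p(x \mid y; \theta_{yx})$; it illustrates the shape of such objectives rather than reproducing the paper's exact formulation:

$$\mathcal{L}(\theta_{xy}, \theta_{yx}) = \mathbb{E}_{x \sim \mathcal{D}_x}\, \mathbb{E}_{\hat{y} \sim p(\cdot \mid x;\, \theta_{xy})} \big[ \log p(x \mid \hat{y};\, \theta_{yx}) \big] + \mathbb{E}_{y \sim \mathcal{D}_y}\, \mathbb{E}_{\hat{x} \sim p(\cdot \mid y;\, \theta_{yx})} \big[ \log p(y \mid \hat{x};\, \theta_{xy}) \big]$$

Under this reading, Iterative Back-Translation ascends such an objective by sampling translations from a frozen model and treating them as synthetic parallel data, so no gradient flows through the sampling step, whereas Dual Learning additionally propagates a policy-gradient-style learning signal through the sampled translations.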
