Explaining and Generalizing Back-Translation through Wake-Sleep

06/12/2018
by Ryan Cotterell, et al.

Back-translation has become a commonly employed heuristic for semi-supervised neural machine translation. The technique is straightforward to apply and has led to state-of-the-art results. In this work, we offer a principled interpretation of back-translation as approximate inference in a generative model of bitext, and we show that the standard implementation of back-translation corresponds to a single iteration of the wake-sleep algorithm in our proposed model. Moreover, this interpretation suggests a natural iterative generalization, which we demonstrate leads to further improvements of up to 1.6 BLEU.
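The iterative generalization described above can be illustrated with a deliberately tiny sketch. The "translation models" below are word-for-word lexicons fit by counting, and the wake/sleep labels are used loosely; all data and modeling choices are illustrative, not the paper's actual setup.

```python
# Toy sketch of iterative back-translation as alternating wake/sleep
# phases. Models are word-level lexicons (dicts), trained by counting
# aligned word pairs in (real or synthetic) bitext. Illustrative only.

from collections import Counter, defaultdict

def train(bitext):
    """Fit a word-for-word lexicon from aligned sentence pairs."""
    counts = defaultdict(Counter)
    for src, tgt in bitext:
        for s, t in zip(src.split(), tgt.split()):
            counts[s][t] += 1
    return {s: c.most_common(1)[0][0] for s, c in counts.items()}

def translate(model, sent):
    # Unknown words pass through unchanged.
    return " ".join(model.get(w, w) for w in sent.split())

# A small seed of parallel data plus monolingual target-side text.
parallel = [("a b", "x y"), ("b c", "y z")]
mono_tgt = ["x z", "y x"]

fwd = train(parallel)                       # source -> target model
bwd = train([(t, s) for s, t in parallel])  # target -> source model

for _ in range(3):  # iterating generalizes one-shot back-translation
    # "Sleep"-like phase: back-translate monolingual target text into
    # synthetic source, giving extra (synthetic-src, real-tgt) pairs.
    synthetic = [(translate(bwd, t), t) for t in mono_tgt]
    # "Wake"-like phase: retrain the forward model on real + synthetic
    # bitext, then refresh the backward model symmetrically.
    fwd = train(parallel + synthetic)
    bwd = train([(t, s) for s, t in parallel + synthetic])

print(translate(fwd, "a c"))  # prints "x z"
```

One-shot back-translation would stop after the first pass through the loop; iterating lets the improved backward model produce better synthetic bitext on each round, which is the generalization the abstract refers to.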


