Unsupervised Domain Adaptation for Neural Machine Translation with Iterative Back Translation

01/22/2020
by   Di Jin, et al.

State-of-the-art neural machine translation (NMT) systems are data-hungry and perform poorly on domains with little supervised data. As data collection is expensive and in many cases infeasible, unsupervised domain adaptation methods are needed. We apply an Iterative Back Translation (IBT) training scheme to in-domain monolingual data: a Transformer-based NMT model repeatedly creates in-domain pseudo-parallel sentence pairs in one translation direction on the fly, which are then used to train the model in the other direction. Evaluated on three domains of a German-to-English translation task with no supervised in-domain data, this simple technique alone (without any out-of-domain parallel data) already surpasses all previous domain adaptation methods, by up to +9.48 BLEU over the strongest previous method and up to +27.77 BLEU over the unadapted baseline. Moreover, given supervised out-of-domain data for the German-to-English and Romanian-to-English language pairs, we can further improve performance, obtaining up to +19.31 BLEU over the strongest baseline and +47.69 BLEU over the unadapted model.
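The IBT loop described above can be sketched as follows. This is a minimal, hedged illustration only: the toy "models" here are word-for-word dictionaries and the `translate`, `train`, and `iterative_back_translation` functions are hypothetical stand-ins, whereas the paper trains Transformer NMT models on the pseudo-parallel pairs. The alternating structure (back-translate monolingual data in one direction, train the opposite direction, repeat) is the point being shown.

```python
# Sketch of Iterative Back-Translation (IBT) with toy word-for-word
# "models" (plain dicts); the real method uses Transformer NMT models.

def translate(model, sentence):
    """Translate word by word, copying unknown words through unchanged."""
    return " ".join(model.get(w, w) for w in sentence.split())

def train(model, pairs):
    """'Train' by recording word alignments from (source, target) pairs."""
    updated = dict(model)
    for src, tgt in pairs:
        for s, t in zip(src.split(), tgt.split()):
            updated[s] = t
    return updated

def iterative_back_translation(model_de_en, model_en_de,
                               mono_de, mono_en, rounds=2):
    """Each round: back-translate in-domain monolingual data in one
    direction to build pseudo-parallel pairs on the fly, then train
    the model for the opposite direction on those pairs."""
    for _ in range(rounds):
        # English monolingual -> pseudo-German sources; train de->en.
        pseudo = [(translate(model_en_de, e), e) for e in mono_en]
        model_de_en = train(model_de_en, pseudo)
        # German monolingual -> pseudo-English sources; train en->de.
        pseudo = [(translate(model_de_en, d), d) for d in mono_de]
        model_en_de = train(model_en_de, pseudo)
    return model_de_en, model_en_de
```

With tiny seed dictionaries and one in-domain sentence per language, a single round already propagates alignments in both directions, which is the mechanism that lets IBT bootstrap from monolingual data alone.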


Related research:

08/26/2019  An Empirical Study of Domain Adaptation for Unsupervised Neural Machine Translation
    Domain adaptation methods have been well-studied in supervised neural ma...

02/07/2020  A Multilingual View of Unsupervised Machine Translation
    We present a probabilistic framework for multilingual neural machine tra...

08/07/2022  Vernacular Search Query Translation with Unsupervised Domain Adaptation
    With the democratization of e-commerce platforms, an increasingly divers...

07/31/2017  Regularization techniques for fine-tuning in neural machine translation
    We investigate techniques for supervised domain adaptation for neural ma...

04/07/2020  Dynamic Data Selection and Weighting for Iterative Back-Translation
    Back-translation has proven to be an effective method to utilize monolin...

10/18/2022  Domain Specific Sub-network for Multi-Domain Neural Machine Translation
    This paper presents Domain-Specific Sub-network (DoSS). It uses a set of...

04/05/2020  Unsupervised Domain Clusters in Pretrained Language Models
    The notion of "in-domain data" in NLP is often over-simplistic and vague...
