
Unsupervised Domain Adaptation for Neural Machine Translation with Iterative Back Translation

by Di Jin, et al.

State-of-the-art neural machine translation (NMT) systems are data-hungry and perform poorly on domains with little supervised data. As data collection is expensive and infeasible in many cases, unsupervised domain adaptation methods are needed. We apply an Iterative Back Translation (IBT) training scheme on in-domain monolingual data, which repeatedly uses a Transformer-based NMT model to create in-domain pseudo-parallel sentence pairs in one translation direction on the fly and then uses them to train the model in the other direction. Evaluated on three domains of a German-to-English translation task with no supervised data, this simple technique alone (without any out-of-domain parallel data) can already surpass all previous domain adaptation methods—up to +9.48 BLEU over the strongest previous method, and up to +27.77 BLEU over the unadapted baseline. Moreover, given available supervised out-of-domain data on German-to-English and Romanian-to-English language pairs, we can further enhance the performance and obtain up to +19.31 BLEU improvement over the strongest baseline, and a +47.69 BLEU gain over the unadapted model.
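The IBT loop described above can be sketched in a few lines. This is a minimal, illustrative toy: the "models" below are plain word-substitution dictionaries standing in for the paper's Transformer NMT models, and the function names and data are assumptions for demonstration, not the authors' implementation.

```python
# Toy sketch of Iterative Back Translation (IBT). Real IBT alternates
# gradient-based training of two Transformer NMT models; here a "model"
# is just a word-level substitution dictionary, for illustration only.

def translate(model, sentence):
    # Word-by-word lookup; unknown words pass through unchanged.
    return " ".join(model.get(w, w) for w in sentence.split())

def train(model, pairs):
    # Stand-in for gradient updates: memorize word alignments
    # from the pseudo-parallel (source, target) pairs.
    for src, tgt in pairs:
        for s, t in zip(src.split(), tgt.split()):
            model[s] = t
    return model

def iterative_back_translation(mono_de, mono_en, de2en, en2de, rounds=3):
    for _ in range(rounds):
        # Back-translate English monolingual data into pseudo
        # (German', English) pairs, then train the de->en model on them.
        pseudo_de = [(translate(en2de, e), e) for e in mono_en]
        de2en = train(de2en, pseudo_de)
        # Symmetric step: pseudo (English', German) pairs train en->de.
        pseudo_en = [(translate(de2en, d), d) for d in mono_de]
        en2de = train(en2de, pseudo_en)
    return de2en, en2de

# Tiny in-domain monolingual corpora (illustrative data).
mono_de = ["hund beisst mann"]
mono_en = ["dog bites man"]
# Seed models that know only part of the vocabulary.
de2en = {"hund": "dog"}
en2de = {"dog": "hund", "bites": "beisst", "man": "mann"}

de2en, en2de = iterative_back_translation(mono_de, mono_en, de2en, en2de)
print(translate(de2en, "hund beisst mann"))  # -> "dog bites man"
```

After one round the de→en model has picked up the full toy vocabulary from the synthetic pairs, mirroring how each direction's model improves from the other direction's back-translations.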

