
Multilingual Bidirectional Unsupervised Translation Through Multilingual Finetuning and Back-Translation

09/06/2022
by   Bryan Li, et al.

We propose a two-stage training approach for developing a single NMT model that translates unseen languages both to and from English. In the first stage, we initialize an encoder-decoder model with pretrained XLM-R and RoBERTa weights, then perform multilingual fine-tuning on parallel data from 25 languages into English. We find this model can generalize to zero-shot translation from unseen languages. In the second stage, we leverage this generalization ability to generate synthetic parallel data from monolingual datasets, then train with successive rounds of back-translation. The final model extends to the English-to-Many direction while retaining Many-to-English performance. We term our approach EcXTra (English-centric Crosslingual (X) Transfer). Our approach sequentially leverages auxiliary parallel data and monolingual data, and is conceptually simple, using only a standard cross-entropy objective in both stages. The final EcXTra model is evaluated on unsupervised NMT for 8 low-resource languages, achieving a new state of the art for English-to-Kazakh (22.3 > 10.4 BLEU) and competitive performance for the other 15 translation directions.
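The second stage's back-translation loop can be sketched as follows. This is a minimal illustration of the data flow, not the authors' implementation: `translate` is a hypothetical stand-in for the model's inference call (here it only tags sentences so the flow is visible), and producing real synthetic pairs would use an actual NMT model and a cross-entropy training step.

```python
def translate(model, sentences, direction):
    # Hypothetical stand-in for NMT inference with the current model.
    # Tags each sentence with its translation direction so the data
    # flow of back-translation is visible in the output.
    return [f"<{direction}> {s}" for s in sentences]

def back_translation_round(model, mono_foreign, mono_english):
    """One round of back-translation: build synthetic parallel data
    from monolingual text in both directions."""
    # The Many-to-English model translates foreign monolingual text into
    # synthetic English, yielding (synthetic English -> foreign) pairs
    # that supervise the English-to-Many direction.
    synthetic_en = translate(model, mono_foreign, "xx-en")
    en_to_xx_pairs = list(zip(synthetic_en, mono_foreign))

    # Symmetrically, English monolingual text is translated into the
    # foreign language, yielding (synthetic foreign -> English) pairs
    # that supervise the Many-to-English direction.
    synthetic_xx = translate(model, mono_english, "en-xx")
    xx_to_en_pairs = list(zip(synthetic_xx, mono_english))

    # Both sets of pairs train the single model with a standard
    # cross-entropy objective before the next round regenerates them.
    return en_to_xx_pairs + xx_to_en_pairs
```

Successive rounds regenerate the synthetic data with the improved model, so translation quality in both directions can bootstrap from monolingual text alone.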


10/16/2021

Towards Making the Most of Multilingual Pretraining for Zero-Shot Neural Machine Translation

This paper demonstrates that multilingual pretraining, a proper fine-tun...
05/11/2020

Leveraging Monolingual Data with Self-Supervision for Multilingual Neural Machine Translation

Over the last few years two promising research directions in low-resourc...
04/18/2021

Zero-shot Cross-lingual Transfer of Neural Machine Translation with Multilingual Pretrained Encoders

Previous works mainly focus on improving cross-lingual transfer for NLU ...
10/14/2021

An Empirical Investigation of Multi-bridge Multilingual NMT models

In this paper, we present an extensive investigation of multi-bridge, ma...
10/24/2020

Cross-Modal Transfer Learning for Multilingual Speech-to-Text Translation

We propose an effective approach to utilize pretrained speech and text m...
10/20/2021

Multilingual Unsupervised Neural Machine Translation with Denoising Adapters

We consider the problem of multilingual unsupervised machine translation...
10/20/2021

Continual Learning in Multilingual NMT via Language-Specific Embeddings

This paper proposes a technique for adding a new source or target langua...