Low-Resource Machine Translation Training Curriculum Fit for Low-Resource Languages

11/30/2021
by   Isidora Chara Tourni, et al.

We conduct an empirical study of neural machine translation (NMT) for truly low-resource languages, and propose a training curriculum suited to cases in which both parallel training data and compute resources are scarce, reflecting the reality of most of the world's languages and of the researchers working on them. Previously, unsupervised NMT, which employs back-translation (BT) and auto-encoding (AE) tasks, has been shown to be ineffective for low-resource languages. We demonstrate that leveraging comparable data and code-switching as weak supervision, combined with the BT and AE objectives, yields remarkable improvements for low-resource languages even when using only modest compute resources. The training curriculum proposed in this work improves over supervised NMT trained on the same backbone architecture by +12.2 BLEU for English to Gujarati and +3.7 BLEU for English to Kazakh, showcasing the potential of weakly supervised NMT for low-resource languages. When trained on supervised data, our curriculum achieves a new state-of-the-art result on the Somali dataset (29.3 BLEU for Somali to English). We also observe that adding more training time and GPUs further improves performance, which underscores the importance of reporting compute resource usage in MT research.
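The two weak-supervision signals named above can be illustrated with a minimal sketch. The `code_switch` helper and the toy lexicon entries below are hypothetical stand-ins (the paper's actual systems are full NMT models); the sketch only shows the shape of the two data-augmentation steps: code-switching, which replaces source words with target-language counterparts from a bilingual lexicon, and back-translation, which pairs real target sentences with synthetic source sides produced by a reverse-direction model.

```python
import random

def code_switch(sentence, lexicon, p=0.5, seed=0):
    """Code-switching as weak supervision: with probability p, replace a
    source-language word with its target-language counterpart, producing
    mixed sentences that help align the two languages' representations."""
    rng = random.Random(seed)
    out = []
    for tok in sentence.split():
        if tok in lexicon and rng.random() < p:
            out.append(lexicon[tok])
        else:
            out.append(tok)
    return " ".join(out)

def back_translate(target_sentences, reverse_model):
    """Back-translation: a target->source model generates a synthetic
    source side, yielding (synthetic_source, real_target) training pairs."""
    return [(reverse_model(t), t) for t in target_sentences]

# Toy stand-ins; the lexicon entries are illustrative, not from the paper.
lexicon = {"house": "uy", "water": "su"}
print(code_switch("the house by the water", lexicon, p=1.0))
# -> "the uy by the su" (every lexicon word replaced when p=1.0)

reverse = lambda t: "<synthetic source for: " + t + ">"
print(back_translate(["target sentence"], reverse))
```

In the paper's curriculum these augmented pairs supplement the BT and AE objectives rather than replace them; the sketch here only demonstrates how the extra supervision signal is constructed from monolingual and comparable data.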

