Reusing a Pretrained Language Model on Languages with Limited Corpora for Unsupervised NMT

09/16/2020
by Alexandra Chronopoulou, et al.

Using a language model (LM) pretrained on two languages with large monolingual corpora to initialize an unsupervised neural machine translation (UNMT) system yields state-of-the-art results. When only limited data is available for one of the languages, however, this method leads to poor translations. We present an effective approach that reuses an LM pretrained only on the high-resource language. The monolingual LM is fine-tuned on both languages and is then used to initialize a UNMT model. To reuse the pretrained LM, we have to modify its predefined vocabulary to account for the new language; we therefore propose a novel vocabulary extension method. Our approach, RE-LM, outperforms a competitive cross-lingual pretraining model (XLM) on English-Macedonian (En-Mk) and English-Albanian (En-Sq), yielding more than +8.3 BLEU points for all four translation directions.
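The core of the method is the vocabulary extension step: the subword vocabulary of the LM pretrained on the high-resource language is enlarged with subwords learned on the low-resource language, and the embedding matrix is grown accordingly before fine-tuning on both languages. The PyTorch sketch below illustrates that idea; the function name, vocabulary sizes, and mean-of-pretrained-embeddings initialization for the new rows are illustrative assumptions, not the authors' exact implementation.

```python
import torch
import torch.nn as nn

def extend_embeddings(old_embedding: nn.Embedding, num_new_tokens: int) -> nn.Embedding:
    """Grow an embedding matrix to cover newly added (low-resource) subwords.

    Rows for the original vocabulary are copied unchanged; rows for the new
    tokens are initialized to the mean of the pretrained embeddings, a common
    heuristic (the paper's exact initialization may differ).
    """
    old_vocab_size, dim = old_embedding.weight.shape
    new_embedding = nn.Embedding(old_vocab_size + num_new_tokens, dim)
    with torch.no_grad():
        new_embedding.weight[:old_vocab_size] = old_embedding.weight
        new_embedding.weight[old_vocab_size:] = old_embedding.weight.mean(dim=0)
    return new_embedding

# Hypothetical usage: an English LM pretrained with a 30k BPE vocabulary,
# extended with 8k additional subwords learned on the low-resource language.
english_lm_embeddings = nn.Embedding(30_000, 1024)  # stands in for the pretrained LM's embedding layer
extended = extend_embeddings(english_lm_embeddings, num_new_tokens=8_000)
print(extended.weight.shape)  # torch.Size([38000, 1024])
```

After the embeddings are extended in this way, the LM is fine-tuned on monolingual data from both languages and then used to initialize the UNMT model, as described in the abstract.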


Related research

Cross-lingual Language Model Pretraining (01/22/2019)
Recent studies have demonstrated the efficiency of generative pretrainin...

PrOnto: Language Model Evaluations for 859 Languages (05/22/2023)
Evaluation datasets are critical resources for measuring the quality of ...

Extending the Subwording Model of Multilingual Pretrained Models for New Languages (11/29/2022)
Multilingual pretrained models are effective for machine translation and...

Improving the Lexical Ability of Pretrained Language Models for Unsupervised Neural Machine Translation (03/18/2021)
Successful methods for unsupervised neural machine translation (UNMT) em...

A Unified Strategy for Multilingual Grammatical Error Correction with Pre-trained Cross-Lingual Language Model (01/26/2022)
Synthetic data construction of Grammatical Error Correction (GEC) for no...

Expanding Pretrained Models to Thousands More Languages via Lexicon-based Adaptation (03/17/2022)
The performance of multilingual pretrained models is highly dependent on...

Bilingual Dictionary-based Language Model Pretraining for Neural Machine Translation (03/12/2021)
Recent studies have demonstrated a perceivable improvement on the perfor...
