Language-Family Adapters for Multilingual Neural Machine Translation

09/30/2022
by Alexandra Chronopoulou, et al.

Massively multilingual models pretrained on abundant corpora with self-supervision achieve state-of-the-art results in a wide range of natural language processing tasks. In machine translation, multilingual pretrained models are often fine-tuned on parallel data from one or multiple language pairs. Multilingual fine-tuning improves performance on medium- and low-resource languages but requires modifying the entire model and can be prohibitively expensive. Training a new set of adapters on each language pair, or a single set of adapters on all language pairs, while keeping the pretrained model's parameters frozen has been proposed as a parameter-efficient alternative. However, the former approach permits no sharing between languages, while the latter shares parameters across all languages and must contend with negative interference. In this paper, we propose training language-family adapters on top of a pretrained multilingual model to facilitate cross-lingual transfer. Our model consistently outperforms other adapter-based approaches. We also demonstrate that language-family adapters provide an effective method for translating to languages unseen during pretraining.
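The abstract contrasts three adapter setups: one adapter set per language pair (no sharing), one shared adapter set for all pairs (negative interference), and the proposed middle ground of one adapter set per language family. As a rough illustration of that middle ground, here is a minimal PyTorch sketch, not the authors' released implementation: standard bottleneck adapters (Houlsby et al., 2019 style) trained on top of frozen pretrained layers and routed by language family. The FAMILY_OF mapping, module names, and bottleneck size are all illustrative assumptions.

```python
import torch
import torch.nn as nn

# Hypothetical language-to-family mapping; the key idea is that related
# languages share one adapter instead of each pair training its own.
FAMILY_OF = {"de": "germanic", "nl": "germanic", "es": "romance", "it": "romance"}

class BottleneckAdapter(nn.Module):
    """Residual bottleneck adapter: project down, nonlinearity, project up."""
    def __init__(self, hidden_dim: int, bottleneck_dim: int = 64):
        super().__init__()
        self.down = nn.Linear(hidden_dim, bottleneck_dim)
        self.up = nn.Linear(bottleneck_dim, hidden_dim)
        self.act = nn.ReLU()

    def forward(self, hidden: torch.Tensor) -> torch.Tensor:
        return hidden + self.up(self.act(self.down(hidden)))

class FamilyAdapterLayer(nn.Module):
    """Wraps one frozen pretrained layer and routes its output through the
    adapter belonging to the current language's family."""
    def __init__(self, frozen_layer: nn.Module, hidden_dim: int, families):
        super().__init__()
        self.layer = frozen_layer
        for p in self.layer.parameters():
            p.requires_grad = False  # pretrained weights stay frozen
        # One small adapter per language family, not per language pair.
        self.adapters = nn.ModuleDict(
            {fam: BottleneckAdapter(hidden_dim) for fam in families}
        )

    def forward(self, hidden: torch.Tensor, lang: str) -> torch.Tensor:
        hidden = self.layer(hidden)
        return self.adapters[FAMILY_OF[lang]](hidden)

# Example: wrap a stand-in frozen layer and run a German batch through it.
layer = FamilyAdapterLayer(nn.Linear(512, 512), hidden_dim=512,
                           families=set(FAMILY_OF.values()))
x = torch.randn(8, 16, 512)   # (batch, seq_len, hidden_dim)
out = layer(x, lang="de")     # routed through the shared Germanic adapter
```

Under this setup only the adapter parameters receive gradients, so related languages such as German and Dutch share transfer capacity while unrelated families stay isolated, which is the trade-off the abstract describes.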

Related research

11/29/2022
Extending the Subwording Model of Multilingual Pretrained Models for New Languages
Multilingual pretrained models are effective for machine translation and...

02/23/2022
Using natural language prompts for machine translation
We explore the use of natural language prompts for controlling various a...

11/02/2020
Emergent Communication Pretraining for Few-Shot Machine Translation
While state-of-the-art models that rely upon massively multilingual pret...

09/10/2021
Efficient Test Time Adapter Ensembling for Low-resource Language Varieties
Adapters are light-weight modules that allow parameter-efficient fine-tu...

01/19/2022
Improving Neural Machine Translation by Denoising Training
We present a simple and effective pretraining strategy Denoising Trainin...

07/14/2022
Learning to translate by learning to communicate
We formulate and test a technique to use Emergent Communication (EC) wit...

05/22/2022
Multilingual Machine Translation with Hyper-Adapters
Multilingual machine translation suffers from negative interference acro...
