Cross-Lingual Transfer with Target Language-Ready Task Adapters

06/05/2023
by Marinela Parovic, et al.

Adapters have emerged as a modular and parameter-efficient approach to (zero-shot) cross-lingual transfer. The established MAD-X framework employs separate language and task adapters which can be arbitrarily combined to perform the transfer of any task to any target language. Subsequently, BAD-X, an extension of the MAD-X framework, achieves improved transfer at the cost of MAD-X's modularity by creating "bilingual" adapters specific to the source-target language pair. In this work, we aim to take the best of both worlds by (i) fine-tuning task adapters adapted to the target language(s) (so-called "target language-ready" (TLR) adapters) to maintain high transfer performance, but (ii) without sacrificing the highly modular design of MAD-X. The main idea of "target language-ready" adapters is to resolve the training-vs-inference discrepancy of MAD-X: the task adapter "sees" the target language adapter for the very first time during inference, and thus might not be fully compatible with it. We address this mismatch by exposing the task adapter to the target language adapter during training, and empirically validate several variants of the idea: in the simplest form, we alternate between using the source and target language adapters during task adapter training, which can be generalized to cycling over any set of language adapters. We evaluate different TLR-based transfer configurations with varying degrees of generality across a suite of standard cross-lingual benchmarks, and find that the most general (and thus most modular) configuration consistently outperforms MAD-X and BAD-X on most tasks and languages.
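Since the alternation scheme is the core idea, a minimal, library-agnostic sketch may help illustrate it. The sketch below assumes a frozen multilingual encoder, frozen language adapters, and a trainable task adapter plus classifier head; all names (Adapter, tlr_training_step, language_adapters, batches, ...) are illustrative assumptions, not the authors' released code.

```python
# Sketch of "target language-ready" (TLR) task adapter training:
# cycle over source and target language adapters while updating only
# the task adapter, so it is compatible with every language adapter
# it may be stacked with at inference time.
import itertools
import torch
import torch.nn as nn

class Adapter(nn.Module):
    """Standard bottleneck adapter: down-projection, nonlinearity, up-projection, residual."""
    def __init__(self, hidden_size: int, bottleneck: int = 64):
        super().__init__()
        self.down = nn.Linear(hidden_size, bottleneck)
        self.up = nn.Linear(bottleneck, hidden_size)

    def forward(self, h: torch.Tensor) -> torch.Tensor:
        return h + self.up(torch.relu(self.down(h)))

def tlr_training_step(backbone, language_adapter, task_adapter, classifier, batch, optimizer):
    """One update of the task adapter with a (frozen) language adapter stacked beneath it."""
    hidden = backbone(batch["input_ids"])       # frozen multilingual encoder
    hidden = language_adapter(hidden)           # frozen language adapter (source or target)
    hidden = task_adapter(hidden)               # trainable task adapter
    logits = classifier(hidden.mean(dim=1))     # simple pooled classification head
    loss = nn.functional.cross_entropy(logits, batch["labels"])
    optimizer.zero_grad()
    loss.backward()                             # gradients flow only into trainable modules
    optimizer.step()
    return loss.item()

def train_tlr_task_adapter(backbone, task_adapter, classifier, language_adapters, batches, optimizer):
    """Cycle over any set of language adapters (source and target(s)) during task training.
    The optimizer should hold only the task adapter and classifier parameters."""
    adapter_cycle = itertools.cycle(language_adapters)
    for batch in batches:                       # labelled source-language task data (zero-shot setup)
        lang_adapter = next(adapter_cycle)
        tlr_training_step(backbone, lang_adapter, task_adapter, classifier, batch, optimizer)
```

In the simplest variant described above, `language_adapters` contains just the source and one target language adapter; the most general configuration passes a larger set of language adapters, keeping the task adapter modular across many targets.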
