Zero-Shot Cross-Lingual Transfer with Meta Learning

03/05/2020
by Farhad Nooralahzadeh et al.

Learning what to share between tasks has recently been a topic of great interest, as strategic sharing of knowledge has been shown to improve downstream task performance. The same applies to sharing between languages, which is especially important given that most of the world's languages are under-resourced. In this paper, we consider the setting of training models on multiple different languages at the same time, when little or no data is available for languages other than English. We show that this challenging setup can be approached using meta-learning: in addition to training a source language model, another model learns to select which training instances are the most beneficial to the first. We experiment using standard supervised, zero-shot cross-lingual, as well as few-shot cross-lingual settings for different natural language understanding tasks (natural language inference, question answering). Our extensive experimental setup demonstrates the consistent effectiveness of meta-learning on a total of 16 languages. We improve upon the state of the art on zero-shot and few-shot NLI and QA tasks on the XNLI and X-WikiRe datasets, respectively. We further conduct a comprehensive analysis, which indicates that the correlation of typological features between languages can further explain when parameter sharing learned via meta-learning is beneficial.
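
The meta-learning described here belongs to the model-agnostic meta-learning (MAML) family that recurs throughout the related work below: an inner loop adapts a copy of the model to a sampled task (here, a language), and an outer loop updates the shared initialization so that such adaptation works well. As a rough illustration only, here is a minimal first-order MAML sketch in PyTorch on synthetic data; the tiny classifier, the make_language_batch helper, and all dimensions and learning rates are illustrative assumptions, not the authors' implementation.

```python
import copy
import torch
import torch.nn as nn

def make_language_batch(n=32, dim=16, n_classes=3):
    """Stand-in for a batch drawn from one auxiliary language (synthetic)."""
    return torch.randn(n, dim), torch.randint(0, n_classes, (n,))

model = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 3))
meta_opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()
inner_lr = 0.1

for step in range(100):                  # outer (meta) loop
    meta_opt.zero_grad()
    for _ in range(4):                   # a few sampled "language" tasks
        learner = copy.deepcopy(model)   # task-specific fast weights
        fast = list(learner.parameters())
        for p in fast:                   # drop any grads copied from `model`
            p.grad = None
        x_s, y_s = make_language_batch()                  # support set
        grads = torch.autograd.grad(loss_fn(learner(x_s), y_s), fast)
        with torch.no_grad():                             # one inner SGD step
            for p, g in zip(fast, grads):
                p -= inner_lr * g
        x_q, y_q = make_language_batch()                  # query set
        loss_fn(learner(x_q), y_q).backward()             # grads on adapted weights
        with torch.no_grad():            # first-order MAML: reuse those grads
            for mp, p in zip(model.parameters(), fast):
                mp.grad = p.grad.clone() if mp.grad is None else mp.grad + p.grad
    meta_opt.step()                      # update the shared initialization
```

In the cross-lingual setting of the paper, the inner loop would adapt on auxiliary-language data and the meta-update would be evaluated on the target language; both batches above are synthetic placeholders for that structure.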

Related research

11/10/2021 · Cross-lingual Adaption Model-Agnostic Meta-Learning for Natural Language Understanding
Meta learning with auxiliary languages has demonstrated promising improv...

03/19/2022 · Meta-X_NLG: A Meta-Learning Approach Based on Language Clustering for Zero-Shot Cross-Lingual Transfer and Generation
Recently, the NLP community has witnessed a rapid advancement in multili...

04/20/2021 · X-METRA-ADA: Cross-lingual Meta-Transfer Learning Adaptation to Natural Language Understanding and Question Answering
Multilingual models, such as M-BERT and XLM-R, have gained increasing po...

03/08/2021 · Meta-Learning with MAML on Trees
In meta-learning, the knowledge learned from previous tasks is transferr...

06/02/2021 · Minimax and Neyman-Pearson Meta-Learning for Outlier Languages
Model-agnostic meta-learning (MAML) has been recently put forth as a str...

01/27/2021 · Multilingual and cross-lingual document classification: A meta-learning approach
The great majority of languages in the world are considered under-resour...

04/16/2020 · Cross-lingual Contextualized Topic Models with Zero-shot Learning
Many data sets in a domain (reviews, forums, news, etc.) exist in parall...
