
A Closer Look at Few-Shot Crosslingual Transfer: Variance, Benchmarks and Baselines

by   Mengjie Zhao, et al.

We present a focused study of few-shot crosslingual transfer, a recently proposed NLP scenario: a pretrained multilingual encoder is first finetuned on many annotations in a high-resource language (typically English), and then finetuned on a few annotations (the "few shots") in a target language. Few-shot transfer brings large improvements over zero-shot transfer. However, we show that it inherently has large variance, so reporting results over multiple sets of few shots is necessary for stable results and fair comparison of different algorithms. To address this problem, we publish our few-shot sets. In a study of why few-shot learning outperforms zero-shot transfer, we show that large models rely heavily on lexical hints when finetuned on a few shots and then overfit quickly. We evaluate different methods that use few-shot annotations, but do not observe significant improvements over the baseline. This calls for better ways of utilizing the few-shot annotations.
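The variance concern above can be made concrete with a minimal sketch: instead of reporting a single few-shot run, accuracies from finetuning on several sampled few-shot sets are aggregated into a mean and standard deviation. The accuracy values below are purely hypothetical, for illustration only.

```python
# Minimal sketch: aggregating few-shot crosslingual transfer results
# over multiple few-shot sample sets, since a single set yields
# unstable numbers and unfair comparisons between methods.
from statistics import mean, stdev

def summarize(accuracies):
    """Return (mean, standard deviation) over per-sample-set accuracies."""
    return mean(accuracies), stdev(accuracies)

# Hypothetical accuracies from finetuning on 5 different few-shot sets
runs = [71.2, 68.5, 74.0, 66.9, 72.3]
avg, sd = summarize(runs)
print(f"accuracy: {avg:.1f} +/- {sd:.1f} over {len(runs)} few-shot sets")
```

Reporting the spread alongside the mean makes it visible when two methods' few-shot gains are within each other's sampling noise.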


Revisiting the Primacy of English in Zero-shot Cross-lingual Transfer

Despite their success, large pre-trained multilingual models have not co...

Zero-shot Cross-lingual Transfer is Under-specified Optimization

Pretrained multilingual encoders enable zero-shot cross-lingual transfer...

English Intermediate-Task Training Improves Zero-Shot Cross-Lingual Transfer Too

Intermediate-task training has been shown to substantially improve pretr...

MulZDG: Multilingual Code-Switching Framework for Zero-shot Dialogue Generation

Building dialogue generation systems in a zero-shot scenario remains a h...

Analyzing the Effects of Reasoning Types on Cross-Lingual Transfer Performance

Multilingual language models achieve impressive zero-shot accuracies in ...

"Diversity and Uncertainty in Moderation" are the Key to Data Selection for Multilingual Few-shot Transfer

Few-shot transfer often shows substantial gain over zero-shot transfer...

FLEX: Unifying Evaluation for Few-Shot NLP

Few-shot NLP research is highly active, yet conducted in disjoint resear...