
When Being Unseen from mBERT is just the Beginning: Handling New Languages With Multilingual Language Models

by Benjamin Muller, et al.

Transfer learning based on pretraining language models on large amounts of raw data has become the new norm for reaching state-of-the-art performance in NLP. Still, it remains unclear how this approach should be applied to unseen languages that are not covered by any available large-scale multilingual language model and for which only a small amount of raw data is generally available. In this work, by comparing multilingual and monolingual models, we show that such models behave in multiple ways on unseen languages. Some languages greatly benefit from transfer learning and behave similarly to closely related high-resource languages, whereas others apparently do not. Focusing on the latter, we show that this failure to transfer is largely due to the script used to write those languages. Transliterating them into a script seen during pretraining significantly improves the performance of large-scale multilingual language models on downstream tasks.
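
The practical upshot is that a language written in a script absent from pretraining can often be made usable simply by mapping its text into a script the model already knows. Below is a minimal sketch of that idea, assuming the Hugging Face transformers library is installed; the character table and example string are toy illustrations, not the transliteration scheme used in the paper:

    # Sketch: map text from a script unseen during pretraining into Latin,
    # then compare how the mBERT tokenizer segments both versions.
    from transformers import AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")

    # Hypothetical character-level transliteration table (toy example only).
    TOY_MAP = {"ئ": "'", "ۇ": "u", "ي": "y", "غ": "gh",
               "ر": "r", "چ": "ch", "ە": "e"}

    def transliterate(text: str, table: dict[str, str]) -> str:
        # Replace each mapped character; leave everything else unchanged.
        return "".join(table.get(ch, ch) for ch in text)

    original = "ئۇيغۇرچە"  # "Uyghur" written in Arabic script
    latinized = transliterate(original, TOY_MAP)

    # Heavier subword fragmentation (or [UNK] tokens) on the original
    # script is one symptom of a language being effectively unseen.
    print(tokenizer.tokenize(original))
    print(tokenizer.tokenize(latinized))

A sharp drop in fragmentation after transliteration suggests the model can now reuse subword representations learned from related high-resource languages, which is consistent with the transfer gains reported in the abstract.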




Related research

Can Multilingual Language Models Transfer to an Unseen Dialect? A Case Study on North African Arabizi

Building natural language processing systems for non-standardized and lo...

Exploiting Language Relatedness for Low Web-Resource Language Model Adaptation: An Indic Languages Study

Recent research in multilingual language models (LM) has demonstrated th...

Instance-based Transfer Learning for Multilingual Deep Retrieval

Perhaps the simplest type of multilingual transfer learning is instance-...

Ancestor-to-Creole Transfer is Not a Walk in the Park

We aim to learn language models for Creole languages for which large vol...

BigScience: A Case Study in the Social Construction of a Multilingual Large Language Model

The BigScience Workshop was a value-driven initiative that spanned one a...

A Primer on Pretrained Multilingual Language Models

Multilingual Language Models (MLLMs) such as mBERT, XLM, XLM-R, etc. hav...

The Geometry of Multilingual Language Model Representations

We assess how multilingual language models maintain a shared multilingua...