Cross-Lingual Ability of Multilingual Masked Language Models: A Study of Language Structure

03/16/2022
by Yuan Chai, et al.

Multilingual pre-trained language models, such as mBERT and XLM-R, have shown impressive cross-lingual ability. Surprisingly, both are trained with multilingual masked language modeling (MLM) alone, without any cross-lingual supervision or aligned data. Despite the encouraging results, we still lack a clear understanding of why cross-lingual ability emerges from multilingual MLM. In this work, we argue that cross-lingual ability comes from the commonality between languages. Specifically, we study three language properties: constituent order, composition, and word co-occurrence. First, we create an artificial language by modifying one property in the source language. Then we measure the contribution of the modified property through the change in cross-lingual transfer performance on the target language. We conduct experiments on six languages and two cross-lingual NLP tasks (textual entailment and sentence retrieval). Our main conclusion is that the contributions of constituent order and word co-occurrence are limited, while composition is more crucial to the success of cross-lingual transfer.
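The abstract does not spell out how a property is "modified", so the following is only a minimal sketch of the general recipe, assuming the constituent-order ablation is approximated by a per-sentence token shuffle; the paper's actual transformations (constituent order, composition, word co-occurrence) operate on syntactic structure, and the function and variable names here are hypothetical.

```python
import random

def make_order_ablated_corpus(sentences, seed=0):
    """Build an artificial-language corpus by shuffling the tokens of each
    sentence. This destroys word/constituent order while keeping each
    sentence's word co-occurrence statistics intact (a hypothetical stand-in
    for the paper's constituent-order modification)."""
    rng = random.Random(seed)
    ablated = []
    for sent in sentences:
        tokens = sent.split()
        rng.shuffle(tokens)
        ablated.append(" ".join(tokens))
    return ablated

# Sketch of the evaluation loop: pre-train a multilingual MLM on
# {ablated source corpus, unmodified target corpus}, fine-tune on the
# source-language task data, then compare zero-shot transfer to the target
# language against the unmodified-source baseline. The drop in transfer
# performance is read as the contribution of the ablated property.
source_corpus = ["the cat sat on the mat", "language models learn structure"]
print(make_order_ablated_corpus(source_corpus))
```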
