Cross-lingual Alignment Methods for Multilingual BERT: A Comparative Study

09/29/2020
by Saurabh Kulshreshtha, et al.

Multilingual BERT (mBERT) has shown reasonable capability for zero-shot cross-lingual transfer when fine-tuned on downstream tasks. Since mBERT is not pre-trained with explicit cross-lingual supervision, transfer performance can be further improved by aligning mBERT with a cross-lingual signal. Prior work proposes several approaches to align contextualised embeddings. In this paper we analyse how different forms of cross-lingual supervision and various alignment methods influence the transfer capability of mBERT in the zero-shot setting. Specifically, we compare parallel-corpus vs. dictionary-based supervision and rotational vs. fine-tuning based alignment methods. We evaluate the performance of different alignment methodologies across eight languages on two tasks: Named Entity Recognition and Semantic Slot Filling. In addition, we propose a novel normalisation method which consistently improves the performance of rotation-based alignment, including a notable 3% improvement for distant and typologically dissimilar languages. Importantly, we identify the biases of the alignment methods towards the type of task and the proximity to the transfer language. We also find that supervision from a parallel corpus is generally superior to dictionary alignments.

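As a concrete illustration of the rotation-based alignment compared in the study, the sketch below solves the orthogonal Procrustes problem that rotates embeddings of one language onto their counterparts in another, given paired vectors extracted from mBERT (e.g. via a bilingual dictionary or word-aligned parallel corpus). It is a minimal sketch, not the paper's exact pipeline: the function names, the random stand-in matrices, and the use of NumPy are assumptions for illustration, and the paper's proposed normalisation step is not shown.

```python
import numpy as np

def learn_rotation(src_emb: np.ndarray, tgt_emb: np.ndarray) -> np.ndarray:
    """Closed-form orthogonal Procrustes solution: find the rotation W that
    minimises ||W x_i - y_i|| over aligned pairs, with W constrained to be
    orthogonal. src_emb and tgt_emb are (n_pairs, dim) matrices of embeddings
    for source/target words that are translations of each other."""
    # SVD of the cross-covariance matrix yields the optimal rotation.
    u, _, vt = np.linalg.svd(tgt_emb.T @ src_emb)
    return u @ vt

def align(src_emb: np.ndarray, rotation: np.ndarray) -> np.ndarray:
    """Map source-language embeddings into the target-language space."""
    return src_emb @ rotation.T

# Hypothetical usage with random stand-ins for extracted mBERT vectors
# (768-dimensional, as in BERT-base).
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 768))   # source-language contextual embeddings
Y = rng.normal(size=(1000, 768))   # target-language counterparts
W = learn_rotation(X, Y)
X_aligned = align(X, W)
```

Fine-tuning based alignment, by contrast, updates the mBERT parameters themselves on the cross-lingual supervision rather than learning a fixed post-hoc rotation of its output space.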