Feature Aggregation in Zero-Shot Cross-Lingual Transfer Using Multilingual BERT

05/17/2022
by Beiduo Chen, et al.

Multilingual BERT (mBERT), a language model pre-trained on large multilingual corpora, has impressive zero-shot cross-lingual transfer capabilities and performs surprisingly well on zero-shot POS tagging and Named Entity Recognition (NER), as well as on cross-lingual model transfer. At present, mainstream methods for cross-lingual downstream tasks typically use the last transformer layer's output of mBERT as the representation of linguistic information. In this work, we explore the complementary property of the lower layers with respect to the last transformer layer of mBERT. A feature aggregation module based on an attention mechanism is proposed to fuse the information contained in different layers of mBERT. Experiments are conducted on four zero-shot cross-lingual transfer datasets, and the proposed method obtains performance improvements on the key multilingual benchmark tasks XNLI (+1.5%) and PAWS-X (+2.4%). From the experimental results, we show that the layers before the last layer of mBERT can provide extra useful information for cross-lingual downstream tasks, and we explore the interpretability of mBERT empirically.
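To make the aggregation idea concrete, below is a minimal sketch, not the authors' released implementation, of attention-based fusion over mBERT's per-layer hidden states, assuming PyTorch and Hugging Face Transformers; the LayerAttentionAggregator module, its scoring function, and the variable names are illustrative assumptions, and the fused token representations would feed whatever downstream task head is used.

```python
# Minimal sketch: attention over mBERT's layer outputs (illustrative, not the paper's code).
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizer


class LayerAttentionAggregator(nn.Module):
    """Fuse per-layer token representations with learned attention weights."""

    def __init__(self, hidden_size: int = 768):
        super().__init__()
        # One scoring vector shared across tokens; it scores each layer's output.
        self.scorer = nn.Linear(hidden_size, 1)

    def forward(self, hidden_states):
        # hidden_states: tuple of (num_layers + 1) tensors, each [batch, seq_len, hidden];
        # index 0 is the embedding layer, which is dropped here.
        layers = torch.stack(hidden_states[1:], dim=2)        # [B, T, L, H]
        scores = self.scorer(layers).squeeze(-1)              # [B, T, L]
        weights = torch.softmax(scores, dim=-1)               # attention over layers
        fused = (weights.unsqueeze(-1) * layers).sum(dim=2)   # [B, T, H]
        return fused


tokenizer = BertTokenizer.from_pretrained("bert-base-multilingual-cased")
encoder = BertModel.from_pretrained(
    "bert-base-multilingual-cased", output_hidden_states=True
)
aggregator = LayerAttentionAggregator()

inputs = tokenizer("Ein kurzer Beispielsatz.", return_tensors="pt")
with torch.no_grad():
    outputs = encoder(**inputs)
fused = aggregator(outputs.hidden_states)  # token representations for the task head
print(fused.shape)  # [1, seq_len, 768]
```

A single shared scoring vector keeps the module lightweight; per-task or per-token scoring functions are equally plausible variants of the same layer-fusion idea.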

Related research

04/17/2021 · Improving Zero-Shot Cross-Lingual Transfer Learning via Robust Training
In recent years, pre-trained multilingual language models, such as multi...

09/29/2020 · Cross-lingual Alignment Methods for Multilingual BERT: A Comparative Study
Multilingual BERT (mBERT) has shown reasonable capability for zero-shot ...

04/10/2022 · Few-Shot Cross-lingual Transfer for Coarse-grained De-identification of Code-Mixed Clinical Texts
Despite the advances in digital healthcare systems offering curated stru...

03/18/2020 · X-Stance: A Multilingual Multi-Target Dataset for Stance Detection
We extract a large-scale stance detection dataset from comments written ...

05/09/2022 · A Balanced Data Approach for Evaluating Cross-Lingual Transfer: Mapping the Linguistic Blood Bank
We show that the choice of pretraining languages affects downstream cros...

09/14/2021 · Everything Is All It Takes: A Multipronged Strategy for Zero-Shot Cross-Lingual Information Extraction
Zero-shot cross-lingual information extraction (IE) describes the constr...

04/30/2020 · A Matter of Framing: The Impact of Linguistic Formalism on Probing Results
Deep pre-trained contextualized encoders like BERT (Devlin et al., 2019)...
