Can Monolingual Pretrained Models Help Cross-Lingual Classification?

11/10/2019
by Zewen Chi, et al.

Multilingual pretrained language models (such as multilingual BERT) have achieved impressive results for cross-lingual transfer. However, because a fixed model capacity must be shared across many languages, multilingual pretraining usually lags behind its monolingual counterparts. In this work, we present two approaches that improve zero-shot cross-lingual classification by transferring knowledge from monolingual pretrained models to multilingual ones. Experimental results on two cross-lingual classification benchmarks show that our methods outperform vanilla multilingual fine-tuning.
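The abstract does not spell out how the knowledge transfer works, so the sketch below shows one plausible realization for illustration only: distilling the soft predictions of a monolingual (English) teacher classifier into a multilingual BERT student that can then be applied zero-shot to other languages. It assumes PyTorch and the Hugging Face transformers library; the checkpoint names, loss weighting, and toy batch are placeholders, and the paper's two actual approaches may differ.

    # Minimal sketch (not the paper's exact method): soft-label distillation from a
    # monolingual teacher into a multilingual student for cross-lingual classification.
    # Assumptions: `torch` and `transformers` are installed; the teacher is an English
    # BERT classifier assumed to be already fine-tuned on the source-language task.
    import torch
    import torch.nn.functional as F
    from transformers import AutoTokenizer, AutoModelForSequenceClassification

    NUM_LABELS = 2          # toy binary task; a real benchmark may use more labels
    TEMPERATURE = 2.0       # softens teacher logits for distillation
    ALPHA = 0.5             # weight between distillation loss and hard-label loss

    # Monolingual teacher (in practice, load a checkpoint fine-tuned on English data).
    teacher_tok = AutoTokenizer.from_pretrained("bert-base-uncased")
    teacher = AutoModelForSequenceClassification.from_pretrained(
        "bert-base-uncased", num_labels=NUM_LABELS)
    teacher.eval()

    # Multilingual student that will later be used for zero-shot transfer.
    student_tok = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
    student = AutoModelForSequenceClassification.from_pretrained(
        "bert-base-multilingual-cased", num_labels=NUM_LABELS)

    optimizer = torch.optim.AdamW(student.parameters(), lr=2e-5)

    def distillation_step(texts, labels):
        """One training step: match the teacher's soft labels and the gold labels."""
        with torch.no_grad():
            t_inputs = teacher_tok(texts, padding=True, truncation=True,
                                   return_tensors="pt")
            t_logits = teacher(**t_inputs).logits

        s_inputs = student_tok(texts, padding=True, truncation=True,
                               return_tensors="pt")
        s_logits = student(**s_inputs).logits

        # KL divergence between temperature-softened teacher and student distributions.
        kd_loss = F.kl_div(
            F.log_softmax(s_logits / TEMPERATURE, dim=-1),
            F.softmax(t_logits / TEMPERATURE, dim=-1),
            reduction="batchmean") * (TEMPERATURE ** 2)
        ce_loss = F.cross_entropy(s_logits, labels)
        loss = ALPHA * kd_loss + (1 - ALPHA) * ce_loss

        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        return loss.item()

    # Toy English batch; in practice this iterates over the source-language training set,
    # and the trained student is then evaluated zero-shot on target-language test data.
    batch_texts = ["The movie was great.", "I did not enjoy the film."]
    batch_labels = torch.tensor([0, 1])
    print(distillation_step(batch_texts, batch_labels))

The design choice illustrated here is that only English labeled data is used during training; the multilingual student inherits the teacher's task knowledge through its logits, while its multilingual pretraining supplies the cross-lingual generalization.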

Related research

04/22/2023
L3Cube-IndicSBERT: A simple approach for learning cross-lingual sentence representations using multilingual BERT
The multilingual Sentence-BERT (SBERT) models map different languages to...

02/28/2022
Cross-Lingual Text Classification with Multilingual Distillation and Zero-Shot-Aware Training
Multilingual pre-trained language models (MPLMs) not only can handle tas...

01/26/2021
First Align, then Predict: Understanding the Cross-Lingual Ability of Multilingual BERT
Multilingual pretrained language models have demonstrated remarkable zer...

09/04/2021
On the ability of monolingual models to learn language-agnostic representations
Pretrained multilingual models have become a de facto default approach f...

09/22/2022
MonoByte: A Pool of Monolingual Byte-level Language Models
The zero-shot cross-lingual ability of models pretrained on multilingual...

09/15/2021
Cross-lingual Transfer of Monolingual Models
Recent studies in zero-shot cross-lingual learning using multilingual mo...

09/10/2019
MultiFiT: Efficient Multi-lingual Language Model Fine-tuning
Pretrained language models are promising particularly for low-resource l...
