Can Monolingual Pretrained Models Help Cross-Lingual Classification?

11/10/2019
by Zewen Chi, et al.

Multilingual pretrained language models (such as multilingual BERT) have achieved impressive results for cross-lingual transfer. However, because the model capacity is fixed while it must cover many languages, multilingual pretraining usually lags behind its monolingual counterparts. In this work, we present two approaches that improve zero-shot cross-lingual classification by transferring knowledge from monolingual pretrained models to multilingual ones. Experimental results on two cross-lingual classification benchmarks show that our methods outperform vanilla multilingual fine-tuning.
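
To make the baseline concrete, the sketch below shows the "vanilla multilingual fine-tuning" setup the abstract compares against: fine-tune multilingual BERT on labeled data in a source language (here English), then apply it directly to a target language with no target-language labels. This is only an illustration of the zero-shot baseline, not the paper's proposed transfer methods; the in-memory examples, hyperparameters, and the German test sentence are placeholder assumptions.

```python
# Minimal sketch of vanilla zero-shot cross-lingual fine-tuning with multilingual BERT.
# Toy data and hyperparameters are illustrative only.
import torch
from torch.optim import AdamW
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_name = "bert-base-multilingual-cased"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

# Source-language (English) training examples; labels: 1 = positive, 0 = negative.
train_texts = ["The movie was wonderful.", "A dull and pointless film."]
train_labels = torch.tensor([1, 0])

batch = tokenizer(train_texts, padding=True, truncation=True, return_tensors="pt")
optimizer = AdamW(model.parameters(), lr=2e-5)

model.train()
for _ in range(3):  # a few illustrative epochs
    outputs = model(**batch, labels=train_labels)
    outputs.loss.backward()
    optimizer.step()
    optimizer.zero_grad()

# Zero-shot evaluation: the fine-tuned classifier is applied to a target language
# (German here) without ever seeing target-language training labels.
model.eval()
test_batch = tokenizer(["Der Film war wunderbar."], padding=True, return_tensors="pt")
with torch.no_grad():
    pred = model(**test_batch).logits.argmax(dim=-1)
print(pred)  # predicted class index for the German sentence
```

The paper's contribution sits on top of this baseline: its two approaches additionally transfer knowledge from monolingual pretrained models into the multilingual model before or during fine-tuning.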
