Extending Multilingual BERT to Low-Resource Languages

04/28/2020
by Zihan Wang, et al.

Multilingual BERT (M-BERT) has been a huge success in both supervised and zero-shot cross-lingual transfer learning. However, this success has focused only on the top 104 Wikipedia languages that it was trained on. In this paper, we propose a simple but effective approach to extend M-BERT (E-BERT) so that it can benefit any new language, and show that our approach benefits languages that are already in M-BERT as well. We perform an extensive set of experiments with Named Entity Recognition (NER) on 27 languages, only 16 of which are in M-BERT, and show an average increase of about 6% F1 on languages already in M-BERT and a 23% F1 increase on new languages.
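The abstract does not spell out the extension recipe, but a common way to adapt M-BERT to an unseen language is to add target-language wordpieces to its vocabulary and continue masked-language-model pretraining on monolingual text in that language. The sketch below illustrates that general idea with the Hugging Face transformers library; it is not the authors' exact method, and the token list, the corpus file target_lang_corpus.txt, and all training settings are placeholder assumptions.

```python
# Hedged sketch: grow M-BERT's vocabulary and continue MLM pretraining on a
# new language's monolingual corpus. All file names, tokens, and
# hyperparameters below are illustrative placeholders.
from transformers import (
    BertTokenizerFast,
    BertForMaskedLM,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)
from datasets import load_dataset

tokenizer = BertTokenizerFast.from_pretrained("bert-base-multilingual-cased")
model = BertForMaskedLM.from_pretrained("bert-base-multilingual-cased")

# Placeholder subword units one might mine from a target-language corpus;
# add_tokens() registers them as extra vocabulary entries (a simplification
# of a true wordpiece-vocabulary extension).
new_wordpieces = ["##ngu", "txa", "##ɓe"]
tokenizer.add_tokens(new_wordpieces)
model.resize_token_embeddings(len(tokenizer))

# Hypothetical monolingual text file for the new language, one sentence per line.
dataset = load_dataset("text", data_files={"train": "target_lang_corpus.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=128)

tokenized = dataset["train"].map(tokenize, batched=True, remove_columns=["text"])

# Standard 15% token masking for masked-language-model training.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm_probability=0.15)

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="extended-mbert",
        num_train_epochs=1,
        per_device_train_batch_size=16,
    ),
    train_dataset=tokenized,
    data_collator=collator,
)
trainer.train()
```

After continued pretraining, the extended model can be fine-tuned on downstream NER data in the usual way (e.g., with a token-classification head), which is the setting the paper evaluates.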

