Evaluation of contextual embeddings on less-resourced languages

07/22/2021
by   Matej Ulčar, et al.

The current dominance of deep neural networks in natural language processing rests on contextual embeddings such as ELMo, BERT, and BERT derivatives. Most existing work focuses on English; in contrast, we present the first multilingual empirical comparison of two ELMo models and several monolingual and multilingual BERT models, using 14 tasks in nine languages. In monolingual settings, our analysis shows that monolingual BERT models generally dominate, with a few exceptions such as dependency parsing, where they are not competitive with ELMo models trained on large corpora. In cross-lingual settings, BERT models trained on only a few languages mostly do best, closely followed by massively multilingual BERT models.


Related research

06/14/2020
FinEst BERT and CroSloEngual BERT: less is more in multilingual models
Large pretrained masked language models have become state-of-the-art sol...

06/13/2023
Monolingual and Cross-Lingual Knowledge Transfer for Topic Classification
This article investigates the knowledge transfer from the RuQTopics data...

04/23/2021
Towards Trustworthy Deception Detection: Benchmarking Model Robustness across Domains, Modalities, and Languages
Evaluating model robustness is critical when developing trustworthy mode...

07/27/2021
gaBERT – an Irish Language Model
The BERT family of neural language models have become highly popular due...

10/09/2019
Is Multilingual BERT Fluent in Language Generation?
The multilingual BERT model is trained on 104 languages and meant to ser...

10/22/2020
Towards Fully Bilingual Deep Language Modeling
Language models based on deep neural networks have facilitated great adv...

08/31/2021
Monolingual versus Multilingual BERTology for Vietnamese Extractive Multi-Document Summarization
Recent researches have demonstrated that BERT shows potential in a wide ...
