Evaluating Multilingual BERT for Estonian

10/01/2020
by Claudia Kittask, et al.

Recently, large pre-trained language models such as BERT have reached state-of-the-art performance on many natural language processing tasks, but for many languages, including Estonian, BERT models are not yet available. However, there exist several multilingual BERT models that can handle multiple languages simultaneously and have also been trained on Estonian data. In this paper, we evaluate four multilingual models (multilingual BERT, multilingual distilled BERT, XLM and XLM-RoBERTa) on several NLP tasks, including POS and morphological tagging, NER and text classification. Our aim is to compare these multilingual BERT models with the existing baseline neural models for these tasks. Our results show that multilingual BERT models generalise well across different Estonian NLP tasks, outperforming all baseline models for POS and morphological tagging and for text classification, and reaching a level comparable with the best baseline for NER, with XLM-RoBERTa achieving the highest results among the multilingual models.
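The evaluation described above fine-tunes each multilingual model on Estonian task data and scores it against the baselines. As a minimal sketch of the scoring step for the sequence-tagging tasks (not the authors' actual code; the example sentences and tags are hypothetical), token-level accuracy for POS or morphological tagging can be computed as:

```python
def token_accuracy(gold, pred):
    """Token-level accuracy for sequence tagging (e.g. POS tags).

    gold, pred: lists of sentences, each a list of tag strings,
    aligned token by token.
    """
    correct = total = 0
    for g_sent, p_sent in zip(gold, pred):
        for g_tag, p_tag in zip(g_sent, p_sent):
            correct += g_tag == p_tag
            total += 1
    return correct / total if total else 0.0

# Hypothetical example: two short sentences with UPOS-style tags
gold = [["NOUN", "VERB", "PUNCT"], ["PRON", "VERB", "NOUN"]]
pred = [["NOUN", "VERB", "PUNCT"], ["PRON", "NOUN", "NOUN"]]
print(token_accuracy(gold, pred))  # 5 of 6 tags correct -> 0.8333...
```

NER is typically scored with entity-level F1 rather than token accuracy, and text classification with plain document-level accuracy, but the pattern of comparing model predictions against gold annotations is the same.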


