GREEK-BERT: The Greeks visiting Sesame Street

08/27/2020
by John Koutsikakis, et al.

Transformer-based language models, such as BERT and its variants, have achieved state-of-the-art performance in several downstream natural language processing (NLP) tasks on generic benchmark datasets (e.g., GLUE, SQuAD, RACE). However, these models have mostly been applied to the resource-rich English language. In this paper, we present GREEK-BERT, a monolingual BERT-based language model for modern Greek. We evaluate its performance in three NLP tasks, i.e., part-of-speech tagging, named entity recognition, and natural language inference, obtaining state-of-the-art performance. Interestingly, in two of the benchmarks GREEK-BERT outperforms two multilingual Transformer-based models (M-BERT, XLM-R), as well as shallower neural baselines operating on pre-trained word embeddings, by a large margin (5%-10%). We make both GREEK-BERT and our training code publicly available, along with code illustrating how GREEK-BERT can be fine-tuned for downstream NLP tasks. We expect these resources to boost NLP research and applications for modern Greek.
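As a quick illustration of the kind of fine-tuning code described in the abstract, the sketch below loads GREEK-BERT through the Hugging Face Transformers library and prepares it for a sentence-pair classification task such as Greek natural language inference. The model identifier nlpaueb/bert-base-greek-uncased-v1, the three-label setup, and the example sentences are assumptions made here for illustration, not details taken from the abstract.

    # Minimal sketch: load GREEK-BERT and run a forward pass for a
    # premise/hypothesis pair (NLI-style sentence-pair classification).
    # The checkpoint name below is assumed; check the authors' repository.
    from transformers import AutoTokenizer, AutoModelForSequenceClassification

    model_name = "nlpaueb/bert-base-greek-uncased-v1"  # assumed checkpoint id
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModelForSequenceClassification.from_pretrained(
        model_name,
        num_labels=3,  # e.g. entailment / neutral / contradiction
    )

    # Tokenize a Greek premise/hypothesis pair and compute class logits.
    inputs = tokenizer(
        "Ο άνθρωπος περπατά στο πάρκο.",        # premise: "The man walks in the park."
        "Κάποιος βρίσκεται σε εξωτερικό χώρο.", # hypothesis: "Someone is outdoors."
        return_tensors="pt",
        truncation=True,
    )
    logits = model(**inputs).logits
    predicted_class = logits.argmax(dim=-1).item()

For token-level tasks such as part-of-speech tagging or named entity recognition, the same checkpoint would instead be loaded with AutoModelForTokenClassification, producing one label per token rather than one per sentence pair.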

