RobBERT: a Dutch RoBERTa-based Language Model

01/17/2020
by   Pieter Delobelle, et al.

Pre-trained language models have dominated the field of natural language processing in recent years, and have led to significant performance gains on a variety of complex natural language tasks. One of the most prominent pre-trained language models is BERT (Bidirectional Encoder Representations from Transformers), which was released in both an English and a multilingual version. Although multilingual BERT performs well on many tasks, recent studies have shown that BERT models trained on a single language significantly outperform the multilingual version. Training a Dutch BERT model thus has a lot of potential for a wide range of Dutch NLP tasks. While previous approaches have used earlier implementations of BERT to train a Dutch BERT, we used RoBERTa, a robustly optimized BERT approach, to train a Dutch language model called RobBERT. We show that RobBERT improves state-of-the-art results on Dutch-specific language tasks, and also outperforms other existing Dutch BERT-based models on sentiment analysis. These results indicate that RobBERT is a powerful pre-trained model for fine-tuning on a large variety of Dutch language tasks. We publicly release this pre-trained model in the hope of supporting further downstream Dutch NLP applications.


