LT@Helsinki at SemEval-2020 Task 12: Multilingual or language-specific BERT?

08/03/2020
by Marc Pàmies, et al.

This paper presents the models submitted by the LT@Helsinki team to the SemEval-2020 Shared Task 12. Our team participated in sub-tasks A and C, titled offensive language identification and offense target identification, respectively. In both cases we used Bidirectional Encoder Representations from Transformers (BERT), a model pre-trained by Google that we fine-tuned on the OLID and SOLID datasets. The results show that offensive tweet classification is one of several language-based tasks where BERT can achieve state-of-the-art results.
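The fine-tuning setup described in the abstract can be illustrated with a minimal sketch using the Hugging Face transformers library. This is not the authors' actual training code: the model checkpoint, hyperparameters, and the tiny in-memory tweet list (standing in for the OLID/SOLID data) are all illustrative assumptions.

```python
# Minimal sketch: fine-tuning a pre-trained BERT model for binary
# offensive-language classification (sub-task A style).
# The "tweets"/"labels" lists below are placeholders for OLID/SOLID;
# the checkpoint and hyperparameters are assumptions, not the paper's settings.
import torch
from torch.utils.data import DataLoader, TensorDataset
from transformers import BertTokenizerFast, BertForSequenceClassification

MODEL_NAME = "bert-base-multilingual-cased"  # multilingual vs. language-specific BERT is the paper's question

tweets = ["you are awful", "have a nice day"]  # placeholder examples
labels = [1, 0]                                # 1 = offensive (OFF), 0 = not offensive (NOT)

tokenizer = BertTokenizerFast.from_pretrained(MODEL_NAME)
model = BertForSequenceClassification.from_pretrained(MODEL_NAME, num_labels=2)

# Tokenize the tweets and wrap them in a simple PyTorch dataset.
enc = tokenizer(tweets, padding=True, truncation=True, max_length=128, return_tensors="pt")
dataset = TensorDataset(enc["input_ids"], enc["attention_mask"], torch.tensor(labels))
loader = DataLoader(dataset, batch_size=2, shuffle=True)

optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
model.train()
for epoch in range(3):  # a few epochs is typical when fine-tuning BERT
    for input_ids, attention_mask, y in loader:
        out = model(input_ids=input_ids, attention_mask=attention_mask, labels=y)
        out.loss.backward()
        optimizer.step()
        optimizer.zero_grad()
```

Swapping MODEL_NAME for a language-specific checkpoint (e.g. an English-only BERT) is the kind of comparison the paper's title refers to; everything else in the sketch stays the same.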
