Exploring Transformers in Emotion Recognition: a comparison of BERT, DistilBERT, RoBERTa, XLNet and ELECTRA

04/05/2021
by Diogo Cortiz, et al.

This paper investigates how Natural Language Understanding (NLU) can be applied to Emotion Recognition, a specific task in affective computing. We fine-tuned several transformer language models (BERT, DistilBERT, RoBERTa, XLNet, and ELECTRA) on a fine-grained emotion dataset and evaluated them in terms of performance (F1-score) and time to complete fine-tuning.
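The abstract does not include the authors' training script, but a minimal sketch of this kind of fine-tuning setup, using the Hugging Face Transformers and Datasets libraries, might look as follows. The dataset choice (GoEmotions), the single-label simplification, and all hyperparameters here are illustrative assumptions, not the paper's exact configuration:

```python
# Minimal sketch: fine-tune one transformer on a fine-grained emotion dataset
# and report macro F1. Dataset and hyperparameters are assumptions, not the
# authors' exact setup.
import numpy as np
from datasets import load_dataset
from sklearn.metrics import f1_score
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

MODEL_NAME = "bert-base-uncased"  # also try: distilbert-base-uncased, roberta-base,
                                  # xlnet-base-cased, google/electra-base-discriminator

# GoEmotions is one fine-grained emotion dataset; "simplified" has 28 labels.
dataset = load_dataset("go_emotions", "simplified")
num_labels = dataset["train"].features["labels"].feature.num_classes

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)

def preprocess(batch):
    # Tokenize, keeping only the first label so this stays single-label classification.
    enc = tokenizer(batch["text"], truncation=True, padding="max_length", max_length=128)
    enc["labels"] = [labels[0] for labels in batch["labels"]]
    return enc

dataset = dataset.map(preprocess, batched=True,
                      remove_columns=dataset["train"].column_names)

model = AutoModelForSequenceClassification.from_pretrained(
    MODEL_NAME, num_labels=num_labels
)

def compute_metrics(eval_pred):
    # Macro F1 treats all emotion classes equally, which suits imbalanced labels.
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    return {"macro_f1": f1_score(labels, preds, average="macro")}

args = TrainingArguments(
    output_dir="emotion-model",
    num_train_epochs=3,
    per_device_train_batch_size=16,
    evaluation_strategy="epoch",
)
trainer = Trainer(
    model=model,
    args=args,
    train_dataset=dataset["train"],
    eval_dataset=dataset["validation"],
    compute_metrics=compute_metrics,
)
trainer.train()
print(trainer.evaluate(dataset["test"]))  # macro F1 on the held-out split
```

Comparing the five models then amounts to re-running the script with a different MODEL_NAME and recording the resulting F1-score alongside the wall-clock time the run takes.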

Related research

07/31/2021 - Using Knowledge-Embedded Attention to Augment Pre-trained Language Models for Fine-Grained Emotion Recognition
Modern emotion recognition systems are trained to recognize only a small...

08/17/2021 - A Weak Supervised Dataset of Fine-Grained Emotions in Portuguese
Affective Computing is the study of how computers can recognize, interpr...

08/09/2022 - Emotion Detection From Tweets Using a BERT and SVM Ensemble Model
Automatic identification of emotions expressed in Twitter data has a wid...

05/27/2023 - ArPanEmo: An Open-Source Dataset for Fine-Grained Emotion Recognition in Arabic Online Content during COVID-19 Pandemic
Emotion recognition is a crucial task in Natural Language Processing (NL...

10/06/2021 - Federated Distillation of Natural Language Understanding with Confident Sinkhorns
Enhancing the user experience is an essential task for application servi...

08/21/2023 - Refashioning Emotion Recognition Modelling: The Advent of Generalised Large Models
After the inception of emotion recognition or affective computing, it ha...

10/31/2018 - Deep Net Features for Complex Emotion Recognition
This paper investigates the influence of different acoustic features, au...
