EmotionX-HSU: Adopting Pre-trained BERT for Emotion Classification

07/23/2019
by Linkai Luo, et al.

This paper describes our approach to EmotionX-2019, the shared task of SocialNLP 2019. To detect the emotion of each utterance in two datasets, dialogues from the TV show Friends and the Facebook chat log EmotionPush, we propose a two-step deep-learning-based methodology: (i) encode each utterance into a sequence of vectors that represent its meaning; and (ii) use a simple softmax classifier to predict which of four candidate emotions the utterance carries. Because labeled utterances are scarce, we utilise a well-trained model, known as BERT, to transfer part of the knowledge learned from a large corpus to our model. We then focus on fine-tuning our model until it fits the in-domain data well. The performance of the proposed model is evaluated by micro-F1 scores on Friends and EmotionPush, reaching 79.1 on Friends. Our model ranks 3rd among 11 submissions.
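As a rough illustration of this two-step pipeline, the sketch below shows a BERT encoder followed by a single softmax layer over four emotion labels. This is not the authors' released code: the HuggingFace transformers library, the bert-base-uncased checkpoint, and the label set (neutral, joy, sadness, anger) are assumptions based on the shared-task description; a real fine-tuning run would train the whole stack with a cross-entropy loss on the in-domain utterances.

```python
# Minimal sketch (assumptions noted above) of the two-step approach:
# (i) encode each utterance with pre-trained BERT, (ii) apply a simple
# softmax classifier over four candidate emotions.

import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizer

EMOTIONS = ["neutral", "joy", "sadness", "anger"]  # assumed 4-class label set


class UtteranceEmotionClassifier(nn.Module):
    def __init__(self, pretrained_name: str = "bert-base-uncased", num_labels: int = 4):
        super().__init__()
        # Step (i): BERT encodes the utterance into contextual vectors.
        self.bert = BertModel.from_pretrained(pretrained_name)
        # Step (ii): a single linear layer + softmax over the emotion labels.
        self.classifier = nn.Linear(self.bert.config.hidden_size, num_labels)

    def forward(self, input_ids, attention_mask):
        outputs = self.bert(input_ids=input_ids, attention_mask=attention_mask)
        pooled = outputs.pooler_output        # [CLS]-based utterance representation
        logits = self.classifier(pooled)
        return torch.softmax(logits, dim=-1)  # class probabilities


if __name__ == "__main__":
    tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
    model = UtteranceEmotionClassifier()
    batch = tokenizer(["Oh my God, this is great!"], return_tensors="pt",
                      padding=True, truncation=True)
    probs = model(batch["input_ids"], batch["attention_mask"])
    print(dict(zip(EMOTIONS, probs[0].tolist())))
```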


Related research

EmotionX-IDEA: Emotion BERT -- an Affectional Model for Conversation (08/17/2019)
In this paper, we investigate the emotion recognition ability of the pre...

EmotionLines: An Emotion Corpus of Multi-Party Conversations (02/23/2018)
Feeling emotion is a critical characteristic to distinguish people from ...

A Pre-trained Audio-Visual Transformer for Emotion Recognition (01/23/2022)
In this paper, we introduce a pretrained audio-visual Transformer traine...

ANA at SemEval-2019 Task 3: Contextual Emotion detection in Conversations through hierarchical LSTMs and BERT (03/30/2019)
This paper describes the system submitted by ANA Team for the SemEval-20...

DENS: A Dataset for Multi-class Emotion Analysis (10/25/2019)
We introduce a new dataset for multi-class emotion analysis from long-fo...

EmotionX-KU: BERT-Max based Contextual Emotion Classifier (06/27/2019)
We propose a contextual emotion classifier based on a transferable langu...

BagBERT: BERT-based bagging-stacking for multi-topic classification (11/10/2021)
This paper describes our submission on the COVID-19 literature annotatio...
