NTUA-SLP at SemEval-2018 Task 1: Predicting Affective Content in Tweets with Deep Attentive RNNs and Transfer Learning

by Christos Baziotis et al.

In this paper we present the deep-learning models that we submitted to the SemEval-2018 Task 1 competition: "Affect in Tweets". We participated in all subtasks for English tweets. We propose a Bi-LSTM architecture equipped with a multi-layer self-attention mechanism. The attention mechanism improves model performance and lets us identify the salient words in a tweet, offering insight into the models and making them more interpretable. Our model utilizes a set of word2vec word embeddings trained on a large collection of 550 million Twitter messages, augmented by a set of word affective features. Due to the limited amount of task-specific training data, we opted for a transfer learning approach, pretraining the Bi-LSTMs on the dataset of SemEval-2017 Task 4A. The proposed approach ranked 1st in Subtask E "Multi-Label Emotion Classification", 2nd in Subtask A "Emotion Intensity Regression", and achieved competitive results in the other subtasks.
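The self-attention mechanism the abstract describes pools the Bi-LSTM hidden states into a single tweet representation by scoring each timestep and taking a softmax-weighted sum. A minimal numpy sketch of that pooling step (function and variable names are illustrative, not from the paper; the Bi-LSTM outputs are stubbed with random values):

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_pool(H, W, v):
    """Score each timestep of H, softmax-normalize the scores,
    and return the weighted sum of hidden states plus the weights."""
    scores = np.tanh(H @ W) @ v      # (T,) one score per timestep
    a = softmax(scores)              # attention distribution over timesteps
    r = a @ H                        # (d,) weighted tweet representation
    return r, a

rng = np.random.default_rng(0)
T, d = 5, 8                          # 5 timesteps, hidden size 8
H = rng.standard_normal((T, d))      # stand-in for Bi-LSTM outputs
W = rng.standard_normal((d, d))      # learned projection (assumed shape)
v = rng.standard_normal(d)           # learned context vector
r, a = attention_pool(H, W, v)
```

The attention weights `a` sum to one, which is what makes them directly interpretable as the per-word saliency scores mentioned above.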




