NTUA-SLP at SemEval-2018 Task 2: Predicting Emojis using RNNs with Context-aware Attention

by   Christos Baziotis, et al.

In this paper we present a deep-learning model that competed at SemEval-2018 Task 2, "Multilingual Emoji Prediction". We participated in subtask A, in which the goal is to predict the most likely emoji associated with an English tweet. The proposed architecture relies on a Long Short-Term Memory (LSTM) network, augmented with an attention mechanism that conditions the weight of each word on a "context vector", computed as an aggregation of the tweet's meaning. Moreover, we initialize the embedding layer of our model with word2vec word embeddings pretrained on a dataset of 550 million English tweets. Finally, our model does not rely on hand-crafted features or lexicons and is trained end-to-end with back-propagation. We ranked 2nd out of 48 teams.
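The core idea of the abstract — weighting each word's LSTM hidden state against a context vector that summarizes the whole tweet — can be sketched numerically. The snippet below is a minimal numpy illustration, not the authors' implementation: the mean of the hidden states stands in for the "aggregation of the tweet's meaning", and the additive scoring parameterization (`Wh`, `Wc`, `b`, `v`) is an assumed, common choice for attention energies.

```python
import numpy as np

def context_aware_attention(H, Wh, Wc, b, v):
    """Attend over LSTM hidden states H (T x d), conditioning each word's
    weight on a context vector c. Here c is the mean of H, a simple proxy
    for the aggregated tweet meaning described in the abstract."""
    c = H.mean(axis=0)                          # context vector, shape (d,)
    scores = np.tanh(H @ Wh + c @ Wc + b) @ v   # unnormalized energies, shape (T,)
    a = np.exp(scores - scores.max())
    a /= a.sum()                                # softmax attention weights
    return a @ H, a                             # weighted tweet representation, weights

# Toy usage: a "tweet" of 5 words, hidden size 4 (all weights are random
# placeholders; in the paper they would be learned end-to-end).
rng = np.random.default_rng(0)
T, d, da = 5, 4, 3
H = rng.standard_normal((T, d))
r, a = context_aware_attention(H,
                               rng.standard_normal((d, da)),
                               rng.standard_normal((d, da)),
                               rng.standard_normal(da),
                               rng.standard_normal(da))
```

The attention weights `a` sum to one, so `r` is a convex combination of the hidden states — words whose states align with the tweet-level context receive more weight in the final representation fed to the classifier.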




