NTUA-SLP at SemEval-2018 Task 2: Predicting Emojis using RNNs with Context-aware Attention

04/18/2018
by Christos Baziotis, et al.

In this paper we present a deep-learning model that competed at SemEval-2018 Task 2, "Multilingual Emoji Prediction". We participated in subtask A, in which the goal is to predict the most likely emoji associated with an English tweet. The proposed architecture relies on a Long Short-Term Memory (LSTM) network augmented with an attention mechanism that conditions the weight of each word on a "context vector", computed as an aggregation of the tweet's meaning. Moreover, we initialize the embedding layer of our model with word2vec word embeddings pretrained on a dataset of 550 million English tweets. Finally, our model does not rely on hand-crafted features or lexicons and is trained end-to-end with back-propagation. We ranked 2nd out of 48 teams.
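The context-aware attention described above can be sketched as follows. This is a minimal NumPy illustration, not the paper's exact formulation: the context vector is taken here as the mean of the hidden states (one simple aggregation of the tweet's meaning), and the parameter names `Wh`, `Wc`, and `v` are hypothetical.

```python
import numpy as np

def context_aware_attention(H, Wh, Wc, v):
    """Attend over LSTM hidden states H (T x d), conditioning each
    word's weight on a context vector c. Wh, Wc, v are learnable
    parameters in a real model; here they are illustrative."""
    c = H.mean(axis=0)                     # context vector: aggregated tweet meaning (assumption)
    scores = np.tanh(H @ Wh + c @ Wc) @ v  # (T,) unnormalized attention scores
    a = np.exp(scores - scores.max())
    a /= a.sum()                           # softmax -> attention weights over words
    return a @ H, a                        # weighted sum of hidden states, weights

# Toy usage with random states and parameters
rng = np.random.default_rng(0)
T, d, da = 5, 8, 6                         # words, hidden size, attention size
H = rng.normal(size=(T, d))
Wh = rng.normal(size=(d, da))
Wc = rng.normal(size=(d, da))
v = rng.normal(size=da)
rep, a = context_aware_attention(H, Wh, Wc, v)
```

The resulting tweet representation `rep` would then feed a softmax classifier over the candidate emojis.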
