Self-Attention: A Better Building Block for Sentiment Analysis Neural Network Classifiers

12/19/2018
by Artaches Ambartsoumian, et al.

Sentiment analysis has seen much progress in the past two decades. For the past few years, neural network approaches, primarily RNNs and CNNs, have been the most successful for this task. Recently, a new category of neural networks, self-attention networks (SANs), has been introduced that uses the attention mechanism as its basic building block. Self-attention networks have been shown to be effective for sequence modeling tasks while using no recurrence or convolutions. In this work we explore the effectiveness of SANs for sentiment analysis. We demonstrate that SANs outperform their RNN and CNN counterparts by comparing their classification accuracy on six datasets as well as model characteristics such as training speed and memory consumption. Finally, we explore the effects of various SAN modifications, such as multi-head attention, as well as two methods of incorporating sequence position information into SANs.
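To make the building block concrete, here is a minimal NumPy sketch of scaled dot-product self-attention combined with sinusoidal position encodings, one of the standard ways to inject sequence position into a SAN. The function names, dimensions, and random weights are illustrative assumptions, not the paper's actual implementation:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def sinusoidal_positions(seq_len, d_model):
    # Fixed sine/cosine position encodings, one d_model-dim vector per position.
    pos = np.arange(seq_len)[:, None]
    i = np.arange(d_model)[None, :]
    angles = pos / np.power(10000.0, (2 * (i // 2)) / d_model)
    return np.where(i % 2 == 0, np.sin(angles), np.cos(angles))

def self_attention(x, Wq, Wk, Wv):
    # Scaled dot-product self-attention over x of shape (seq_len, d_model).
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    scores = q @ k.T / np.sqrt(k.shape[-1])   # pairwise token affinities
    weights = softmax(scores, axis=-1)        # each row sums to 1
    return weights @ v                        # position-wise mixture of values

rng = np.random.default_rng(0)
seq_len, d_model = 5, 8
# Token embeddings (random here) plus position information, since
# attention alone is order-invariant.
x = rng.normal(size=(seq_len, d_model)) + sinusoidal_positions(seq_len, d_model)
Wq, Wk, Wv = (rng.normal(size=(d_model, d_model)) for _ in range(3))
out = self_attention(x, Wq, Wk, Wv)
print(out.shape)  # (5, 8): one contextualized vector per input token
```

Multi-head attention, one of the modifications studied in the paper, repeats this computation with several independent projection triples and concatenates the results; a classifier head (e.g. pooling plus a linear layer) would then map the output to sentiment labels.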

