
Attention-based Neural Bag-of-Features Learning for Sequence Data

by   Dat Thanh Tran, et al.
Aarhus Universitet
Tampere Universities

In this paper, we propose 2D-Attention (2DA), a generic attention formulation for sequence data that acts as a complementary computation block able to detect and focus on the sources of information most relevant to the given learning objective. The proposed attention module is incorporated into the recently proposed Neural Bag-of-Features (NBoF) model to enhance its learning capacity. Since 2DA acts as a plug-in layer, injecting it into different computation stages of the NBoF model yields different 2DA-NBoF architectures, each with its own interpretation. We conducted extensive experiments on financial forecasting, audio analysis, and medical diagnosis problems to benchmark the proposed formulations against existing methods, including the widely used Gated Recurrent Units. Our empirical analysis shows that the proposed attention formulations not only improve the performance of NBoF models but also make them resilient to noisy data.
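The abstract does not give the exact 2DA equations, but the idea of a plug-in attention block that re-weights a multivariate sequence can be illustrated with generic soft attention along the temporal axis. The sketch below is an assumption for illustration only: the function name `temporal_attention`, the bilinear scoring with a learnable matrix `W`, and the weight shapes are not taken from the paper.

```python
import numpy as np

def softmax(z, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(z - z.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def temporal_attention(X, W):
    """Generic soft attention over the time axis of a sequence.

    X : (T, D) multivariate sequence (T time steps, D features).
    W : (D, D) learnable weight matrix (hypothetical parameterization).

    Returns a (T, D) sequence in which each time step is a convex
    combination of all time steps, weighted by learned relevance.
    """
    scores = X @ W @ X.T          # (T, T) pairwise relevance scores
    A = softmax(scores, axis=-1)  # each row sums to 1
    return A @ X                  # attention-weighted sequence, same shape as X
```

Because the block maps a (T, D) input to a (T, D) output, it can be inserted between existing layers without changing surrounding shapes, which is what makes such a layer usable as a "plug-in" at different computation stages.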



