Self-Attention Neural Bag-of-Features

01/26/2022
by Kateryna Chumachenko et al.

In this work, we propose several attention formulations for multivariate sequence data. We build on the recently introduced 2D-Attention and reformulate its attention learning methodology: instead of learning the relevance of the feature and temporal dimensions directly, we quantify it through latent spaces based on self-attention. In addition, we propose a joint feature-temporal attention mechanism that learns a joint 2D attention mask highlighting relevant information, without treating the feature and temporal representations independently. The proposed approaches can be used in various architectures; here we evaluate them in combination with a Neural Bag-of-Features feature extraction module. Experiments on several sequence data analysis tasks show that our approach improves performance compared to standard methods.
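To make the idea concrete, below is a minimal sketch in PyTorch of what a joint feature-temporal attention mask followed by a Neural Bag-of-Features pooling layer could look like. This is not the paper's implementation: the class names (JointFeatureTemporalAttention, NeuralBoF), the latent projection, the flattened-softmax normalization, the magnitude rescaling, and all layer sizes are illustrative assumptions based only on the abstract.

```python
# A minimal sketch, assuming PyTorch. NOT the authors' implementation:
# the exact attention formulation and all sizes are illustrative guesses.
import torch
import torch.nn as nn


class JointFeatureTemporalAttention(nn.Module):
    """Learns a joint 2D (time x feature) attention mask through a latent
    space, instead of scoring temporal and feature dimensions separately."""

    def __init__(self, n_features: int, latent_dim: int = 32):
        super().__init__()
        self.to_latent = nn.Linear(n_features, latent_dim)   # latent projection
        self.to_scores = nn.Linear(latent_dim, n_features)   # per-element scores

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, T, D) multivariate sequence
        h = torch.tanh(self.to_latent(x))          # (batch, T, latent)
        scores = self.to_scores(h)                 # (batch, T, D)
        b, t, d = scores.shape
        # Normalize jointly over the flattened time x feature grid, so the
        # mask does not factor into independent temporal and feature weights.
        mask = torch.softmax(scores.reshape(b, t * d), dim=-1).reshape(b, t, d)
        # Rescale by the number of elements so the masked sequence keeps a
        # comparable magnitude (an illustrative assumption).
        return x * mask * (t * d)


class NeuralBoF(nn.Module):
    """A common Neural Bag-of-Features formulation: soft codeword
    memberships per time step, averaged over time into a histogram."""

    def __init__(self, n_features: int, n_codewords: int = 64):
        super().__init__()
        self.codebook = nn.Parameter(torch.randn(n_codewords, n_features))
        self.bandwidth = nn.Parameter(torch.ones(n_codewords))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Distances from each time step to each codeword: (batch, T, K)
        dist = ((x.unsqueeze(2) - self.codebook) ** 2).sum(dim=-1).sqrt()
        memberships = torch.softmax(-dist * self.bandwidth.abs(), dim=-1)
        return memberships.mean(dim=1)             # (batch, K) histogram


x = torch.randn(8, 50, 12)                  # 8 sequences, 50 steps, 12 features
model = nn.Sequential(JointFeatureTemporalAttention(12), NeuralBoF(12))
histogram = model(x)                        # (8, 64) fixed-length representation
```

The key design point illustrated here is the single softmax over the flattened time x feature grid: relevance is assigned to individual (time step, feature) cells rather than composed from separate per-dimension weights, which is what distinguishes the joint mask from applying temporal and feature attention independently.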


Related research

06/25/2021  HAN: An Efficient Hierarchical Self-Attention Network for Skeleton-Based Gesture Recognition
Previous methods for skeleton-based gesture recognition mostly arrange t...

05/25/2020  Attention-based Neural Bag-of-Features Learning for Sequence Data
In this paper, we propose 2D-Attention (2DA), a generic attention formul...

07/19/2021  Action Forecasting with Feature-wise Self-Attention
We present a new architecture for human action forecasting from videos. ...

02/11/2020  Feature Importance Estimation with Self-Attention Networks
Black-box neural network models are widely used in industry and science,...

04/23/2019  Relevant feature extraction for statistical inference
We introduce an algorithm that learns correlations between two datasets,...

10/19/2020  Attention Augmented ConvLSTM for Environment Prediction
Safe and proactive planning in robotic systems generally requires accura...

01/20/2021  Classifying Scientific Publications with BERT – Is Self-Attention a Feature Selection Method?
We investigate the self-attention mechanism of BERT in a fine-tuning sce...
