EleAtt-RNN: Adding Attentiveness to Neurons in Recurrent Neural Networks

09/03/2019
by Pengfei Zhang, et al.

Recurrent neural networks (RNNs) are capable of modeling the temporal dependencies of complex sequential data. In general, existing RNN structures concentrate on controlling the contributions of current and previous information, but they ignore the fact that different elements of an input vector can carry different levels of importance. We propose a simple yet effective Element-wise-Attention Gate (EleAttG), which can easily be added to an RNN block (e.g., all RNN neurons in an RNN layer) to give the RNN neurons attentiveness. For an RNN block, an EleAttG adaptively modulates the input by assigning a different level of importance, i.e., attention, to each element/dimension of the input. We refer to an RNN block equipped with an EleAttG as an EleAtt-RNN block. Instead of modulating the input as a whole, the EleAttG modulates the input at fine granularity, i.e., element-wise, and the modulation is content adaptive. The proposed EleAttG, as an additional fundamental unit, is general and can be applied to any RNN structure, e.g., a standard RNN, Long Short-Term Memory (LSTM), or Gated Recurrent Unit (GRU). We demonstrate the effectiveness of EleAtt-RNN by applying it to several tasks, including action recognition from both skeleton-based data and RGB videos, gesture recognition, and sequential MNIST classification. Experiments show that adding attentiveness to RNN blocks through EleAttGs significantly improves the modeling power of RNNs.
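As a rough illustration of the idea (a minimal sketch, not the authors' exact implementation), the following PyTorch snippet wraps a standard GRU cell with an element-wise attention gate: the gate computes a per-dimension response from the current input and the previous hidden state, and the element-wise modulated input is then fed to the unchanged recurrent unit. The class and variable names are illustrative, and the sigmoid gate activation is an assumption.

import torch
import torch.nn as nn

class EleAttGRUCell(nn.Module):
    # A GRU cell preceded by an Element-wise-Attention Gate (EleAttG): a sketch.
    def __init__(self, input_size, hidden_size):
        super().__init__()
        # EleAttG: maps [x_t; h_{t-1}] to one attention response per input dimension.
        self.att = nn.Linear(input_size + hidden_size, input_size)
        self.gru = nn.GRUCell(input_size, hidden_size)

    def forward(self, x, h):
        # a_t = sigmoid(W_a [x_t; h_{t-1}] + b_a): one weight in (0, 1) per element of x_t.
        a = torch.sigmoid(self.att(torch.cat([x, h], dim=-1)))
        # The unchanged recurrent unit receives the content-adaptively modulated input a_t * x_t.
        return self.gru(a * x, h)

At each time step, the cell is called as h = cell(x_t, h), with x_t of shape (batch, input_size) and h of shape (batch, hidden_size); the same wrapping applies to a standard RNN or LSTM cell.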


Related research

07/12/2018 · Adding Attentiveness to the Neurons in Recurrent Neural Networks
Recurrent neural networks (RNNs) are capable of modeling the temporal dy...

04/11/2016 · Deep Gate Recurrent Neural Network
This paper introduces two recurrent neural network structures called Sim...

03/13/2018 · Independently Recurrent Neural Network (IndRNN): Building A Longer and Deeper RNN
Recurrent neural networks (RNNs) have been widely used for processing se...

10/30/2018 · Recurrent Attention Unit
Recurrent Neural Network (RNN) has been successfully applied in many seq...

03/27/2019 · Recurrent Neural Networks For Accurate RSSI Indoor Localization
This paper proposes recurrent neural networks (RNNs) for a fingerprintin...

06/12/2022 · ChordMixer: A Scalable Neural Attention Model for Sequences with Different Lengths
Sequential data naturally have different lengths in many domains, with s...

11/26/2021 · On Recurrent Neural Networks for learning-based control: recent results and ideas for future developments
This paper aims to discuss and analyze the potentialities of Recurrent N...
