Long Short-Term Attention

10/30/2018
by Guoqiang Zhong, et al.

In order to learn effective features from temporal sequences, the long short-term memory (LSTM) network is widely applied. A critical component of LSTM is the memory cell, which can extract, process, and store temporal information. Nevertheless, the memory cell in LSTM is not explicitly driven to attend to particular parts of the sequence. The attention mechanism, in contrast, can focus on specific information in the data. In this paper, we present a novel neural model, called long short-term attention (LSTA), which seamlessly merges the attention mechanism into LSTM. Beyond processing long- and short-term dependencies, it can distill effective and valuable information from the sequences with the attention mechanism. Experiments show that LSTA achieves promising learning performance on various deep learning tasks.
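The abstract only sketches the idea of merging attention into the LSTM memory cell, so here is a minimal, hypothetical illustration of one plausible way to do that, written in PyTorch. The class name LSTACell, the attention scorer, and the choice to fold an attended context over a window of past hidden states into the gating are all assumptions for illustration, not the authors' actual formulation.

import torch
import torch.nn as nn

class LSTACell(nn.Module):
    """Hypothetical sketch of a long short-term attention (LSTA) cell:
    a standard LSTM cell whose gates also see an attention-weighted
    summary of recent hidden states. The paper's exact equations may
    differ."""

    def __init__(self, input_size, hidden_size):
        super().__init__()
        self.hidden_size = hidden_size
        # One linear map producing all four LSTM gates (i, f, o, g).
        self.gates = nn.Linear(input_size + hidden_size, 4 * hidden_size)
        # Scores each stored past hidden state for attention.
        self.attn = nn.Linear(hidden_size, 1)

    def forward(self, x, h, c, history):
        # history: (batch, window, hidden_size) of recent hidden states.
        scores = self.attn(history).squeeze(-1)             # (batch, window)
        weights = torch.softmax(scores, dim=-1)             # attention weights
        context = (weights.unsqueeze(-1) * history).sum(1)  # (batch, hidden)

        # Mix the attended context into the recurrent state before gating.
        z = self.gates(torch.cat([x, h + context], dim=-1))
        i, f, o, g = z.chunk(4, dim=-1)
        c = torch.sigmoid(f) * c + torch.sigmoid(i) * torch.tanh(g)
        h = torch.sigmoid(o) * torch.tanh(c)
        return h, c

# Example step with made-up sizes: batch 8, window of 5 past states.
cell = LSTACell(input_size=16, hidden_size=32)
x = torch.randn(8, 16)
h = torch.zeros(8, 32)
c = torch.zeros(8, 32)
history = torch.zeros(8, 5, 32)
h, c = cell(x, h, c, history)

Adding the context to h before gating is just one design choice; the attention output could equally feed a separate gate or modulate the candidate update directly.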

