Learning long-term dependencies for action recognition with a biologically-inspired deep network

11/16/2016
by Yemin Shi, et al.

Despite considerable research effort in recent years, efficiently learning long-term dependencies from sequences remains a challenging task. Recurrent neural networks (RNNs) and their variants, such as long short-term memory (LSTM) and the gated recurrent unit (GRU), are key models for sequence learning, yet they are still not powerful enough in practice. One possible reason is that they have only feedforward connections, unlike the biological neural system, which is typically composed of both feedforward and feedback connections. To address this problem, this paper proposes a biologically-inspired deep network called shuttleNet (our code is available at <https://github.com/shiyemin/shuttlenet>). Technically, shuttleNet consists of several processors, each of which is a GRU associated with multiple groups of cells and states. Unlike traditional RNNs, all processors inside shuttleNet are loop-connected to mimic the brain's feedforward and feedback connections, and they are shared across multiple pathways in the loop connection. An attention mechanism is then employed to select the best information-flow pathway. Extensive experiments on two benchmark datasets (i.e., UCF101 and HMDB51) show that simply embedding shuttleNet into a CNN-RNN framework beats state-of-the-art methods.
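To make the architecture concrete, below is a minimal PyTorch sketch of a shuttleNet-style layer: several GRU-cell processors connected in a loop and shared across pathways (one pathway per entry point into the loop), with each processor keeping one state group per pathway and an attention layer weighting the pathway outputs. The pathway enumeration, state bookkeeping, and attention scoring here are assumptions made for illustration, not the authors' implementation; see the repository linked above for the reference code.

```python
# Minimal sketch of a shuttleNet-style layer, assuming PyTorch.
# The pathway/state layout and attention scoring are illustrative guesses,
# not the authors' code (see https://github.com/shiyemin/shuttlenet).
import torch
import torch.nn as nn
import torch.nn.functional as F

class ShuttleNetSketch(nn.Module):
    def __init__(self, input_size, hidden_size, num_processors=4):
        super().__init__()
        self.num_processors = num_processors
        # Each "processor" is a GRU cell; all are shared across pathways.
        self.processors = nn.ModuleList(
            [nn.GRUCell(hidden_size, hidden_size) for _ in range(num_processors)]
        )
        self.embed = nn.Linear(input_size, hidden_size)
        # Scores each pathway's output so attention can weight the pathways.
        self.attn = nn.Linear(hidden_size, 1)

    def forward(self, x, states):
        # states[k][p]: hidden state of processor p on pathway k; each processor
        # keeps one state group per pathway ("multiple groups of cells and states").
        h = self.embed(x)
        pathway_outputs = []
        new_states = [[None] * self.num_processors for _ in range(self.num_processors)]
        for k in range(self.num_processors):
            out = h
            # One pathway = the shared processor loop entered at offset k,
            # mimicking combined feedforward and feedback connections.
            for step in range(self.num_processors):
                p = (k + step) % self.num_processors
                out = self.processors[p](out, states[k][p])
                new_states[k][p] = out
            pathway_outputs.append(out)
        stacked = torch.stack(pathway_outputs, dim=1)   # (batch, P, hidden)
        weights = F.softmax(self.attn(stacked), dim=1)  # attention over pathways
        output = (weights * stacked).sum(dim=1)         # weighted pathway mixture
        return output, new_states

# Usage: feed one frame feature per time step, carrying the states along.
layer = ShuttleNetSketch(input_size=2048, hidden_size=512)
B, P = 8, layer.num_processors
states = [[torch.zeros(B, 512) for _ in range(P)] for _ in range(P)]
y, states = layer(torch.randn(B, 2048), states)
```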

