Compressing Recurrent Neural Networks with Tensor Ring for Action Recognition

11/19/2018
by   Yu Pan, et al.

Recurrent Neural Networks (RNNs) and their variants, such as Long Short-Term Memory (LSTM) networks and Gated Recurrent Unit (GRU) networks, have achieved promising performance in sequential data modeling. The hidden layers in RNNs can be regarded as memory units that store information from the sequential context. However, when dealing with high-dimensional input data, such as video and text, the input-to-hidden linear transformation in RNNs incurs high memory usage and heavy computational cost, which makes training RNNs unscalable and difficult. To address this challenge, we propose a novel compact LSTM model, named TR-LSTM, which uses the low-rank tensor ring decomposition (TRD) to reformulate the input-to-hidden transformation. Compared with other tensor decomposition methods, TR-LSTM is more stable. In addition, TR-LSTM supports end-to-end training and provides a fundamental building block for RNNs handling large input data. Experiments on real-world action recognition datasets demonstrate the promising performance of the proposed TR-LSTM compared with the tensor train LSTM and other state-of-the-art competitors.
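To make the idea concrete, below is a minimal NumPy sketch of a tensor-ring-factored weight matrix. The core shapes, mode sizes, and TR-ranks are illustrative assumptions, not the paper's configuration; the actual TR-LSTM applies this factorization to the input-to-hidden weights inside the LSTM gates.

```python
import numpy as np

# Sketch of a tensor-ring (TR) decomposition of a weight matrix W (I x O).
# W is viewed as a higher-order tensor (here 4th-order, I = 4*8, O = 6*5)
# represented by cores G_k of shape (r_k, n_k, r_{k+1}), with r_{d+1} = r_1
# so the chain of cores closes into a ring. All sizes below are hypothetical.

def tr_reconstruct(cores):
    """Rebuild the full tensor from TR cores:
    T[i1, ..., id] = trace(G_1[:, i1, :] @ G_2[:, i2, :] @ ... @ G_d[:, id, :])."""
    full = cores[0]                                  # shape (r1, n1, r2)
    for core in cores[1:]:
        # contract the trailing rank index with the next core's leading one
        full = np.tensordot(full, core, axes=([-1], [0]))
    # full now has shape (r1, n1, ..., nd, r1); the trace closes the ring
    return np.trace(full, axis1=0, axis2=-1)

rng = np.random.default_rng(0)
ranks = [3, 3, 3, 3]        # TR-ranks r1..r4; the ring wraps r5 back to r1
modes = [4, 8, 6, 5]        # mode sizes n1..n4: input 4*8 = 32, output 6*5 = 30
cores = [rng.standard_normal((ranks[k], modes[k], ranks[(k + 1) % 4]))
         for k in range(4)]

W = tr_reconstruct(cores).reshape(4 * 8, 6 * 5)      # dense 32 x 30 weight

# Compression: the 4 cores store far fewer parameters than the dense matrix.
print("dense params:", W.size)                       # 960
print("TR params:   ", sum(c.size for c in cores))   # 207

# The factored weight is used like any input-to-hidden map: y = x @ W.
x = rng.standard_normal(32)
y = x @ W
```

In practice one would contract the input with the cores directly rather than materializing W, which is where the memory and compute savings during training come from.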

Related research

08/21/2020
Kronecker CP Decomposition with Fast Multiplication for Compressing RNNs
Recurrent neural networks (RNNs) are powerful in the tasks oriented to s...

12/14/2017
Learning Compact Recurrent Neural Networks with Block-Term Tensor Decomposition
Recurrent Neural Networks (RNNs) are powerful sequence modeling tools. H...

07/06/2017
Tensor-Train Recurrent Neural Networks for Video Classification
The Recurrent Neural Networks and their variants have shown promising pe...

09/06/2019
Compact Autoregressive Network
Autoregressive networks can achieve promising performance in many sequen...

06/22/2018
Persistent Hidden States and Nonlinear Transformation for Long Short-Term Memory
Recurrent neural networks (RNNs) have been drawing much attention with g...

10/09/2020
Recurrent babbling: evaluating the acquisition of grammar from limited input data
Recurrent Neural Networks (RNNs) have been shown to capture various aspe...

10/28/2019
On Generalization Bounds of a Family of Recurrent Neural Networks
Recurrent Neural Networks (RNNs) have been widely applied to sequential ...
