Kronecker CP Decomposition with Fast Multiplication for Compressing RNNs

08/21/2020
by Dingheng Wang, et al.

Recurrent neural networks (RNNs) are powerful for tasks involving sequential data, such as natural language processing and video recognition. However, since modern RNNs, including long short-term memory (LSTM) and gated recurrent unit (GRU) networks, have complex topologies and high space and computation complexity, compressing them has become a hot and promising research topic in recent years. Among the many compression methods, tensor decomposition, e.g., tensor train (TT), block term (BT), tensor ring (TR), and hierarchical Tucker (HT), appears to be one of the most promising approaches, since very high compression ratios can be obtained. Nevertheless, none of these tensor decomposition formats provides both space and computation efficiency. In this paper, we compress RNNs based on a novel Kronecker CANDECOMP/PARAFAC (KCP) decomposition, which is derived from Kronecker tensor (KT) decomposition, by proposing two fast algorithms for multiplication between the input and the tensor-decomposed weight. Experiments on the UCF11, YouTube Celebrities Face, and UCF50 datasets verify that the proposed KCP-RNNs achieve accuracy comparable to RNNs in other tensor-decomposed formats, and that a compression ratio of up to 278,219x can be obtained with low-rank KCP. More importantly, KCP-RNNs are efficient in both space and computation complexity compared with other tensor-decomposed RNNs under similar ranks. In addition, we find that KCP offers the best potential for parallel computing to accelerate the calculations in neural networks.
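To give a flavor of the fast multiplication idea described in the abstract, the sketch below shows how a weight matrix stored as a rank-R sum of Kronecker products can be applied to an input vector without ever materializing the full matrix, using the standard vec-trick identity (A ⊗ B) vec(X) = vec(B X Aᵀ). This is a minimal illustration with two Kronecker factors per CP term, not the paper's actual KCP algorithm; the function name kcp_matvec and all shapes are illustrative assumptions.

```python
import numpy as np

def kcp_matvec(A, B, x):
    """Compute y = (sum_r kron(A[r], B[r])) @ x without forming the full matrix.

    A: list of R matrices, each of shape (m1, n1)  (hypothetical factors)
    B: list of R matrices, each of shape (m2, n2)
    x: input vector of length n1 * n2
    """
    m1, n1 = A[0].shape
    m2, n2 = B[0].shape
    # vec() follows the column-stacking convention, so x reshapes to an
    # (n2, n1) matrix in Fortran (column-major) order.
    X = x.reshape(n2, n1, order="F")
    # Vec-trick per term: kron(A_r, B_r) @ vec(X) = vec(B_r @ X @ A_r.T)
    Y = sum(Br @ X @ Ar.T for Ar, Br in zip(A, B))
    return Y.reshape(m1 * m2, order="F")

# Sanity check against the explicitly materialized Kronecker-product weight.
rng = np.random.default_rng(0)
R, m1, n1, m2, n2 = 3, 4, 5, 6, 7
A = [rng.standard_normal((m1, n1)) for _ in range(R)]
B = [rng.standard_normal((m2, n2)) for _ in range(R)]
x = rng.standard_normal(n1 * n2)

W = sum(np.kron(Ar, Br) for Ar, Br in zip(A, B))
assert np.allclose(W @ x, kcp_matvec(A, B, x))
```

The point of the trick is the cost model: the naive product costs O(m1·m2·n1·n2) per term once W is formed, while the factored route touches only the small factors, which is what makes such decompositions attractive for both storage and computation.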

Related research

Compressing Recurrent Neural Networks with Tensor Ring for Action Recognition (11/19/2018)
Neural Networks Compression for Language Modeling (08/20/2017)
Tensor train decompositions on recurrent networks (06/09/2020)
Compression of Recurrent Neural Networks for Efficient Language Modeling (02/06/2019)
Tensor-Train Recurrent Neural Networks for Video Classification (07/06/2017)
Block-term Tensor Neural Networks (10/10/2020)
TedNet: A Pytorch Toolkit for Tensor Decomposition Networks (04/11/2021)
