Hybrid Tensor Decomposition in Neural Network Compression

06/29/2020
by   Bijiao Wu, et al.

Deep neural networks (DNNs) have recently enabled impressive breakthroughs in a wide range of artificial intelligence (AI) applications thanks to their capability of learning high-level features from big data. However, the computational resources that DNNs demand, especially storage, keep growing because ever-larger models are required for increasingly complicated applications. To address this problem, several tensor decomposition methods, including tensor-train (TT) and tensor-ring (TR), have been applied to compress DNNs and have shown considerable compression effectiveness. In this work, we introduce the hierarchical Tucker (HT) format, a classical but rarely used tensor decomposition method, and investigate its capability for neural network compression. We convert weight matrices and convolutional kernels into both the HT and TT formats for a comparative study, since TT is the most widely used decomposition method and is a variant of HT. We further find, both theoretically and experimentally, that the HT format performs better at compressing weight matrices, while the TT format is better suited to compressing convolutional kernels. Based on this observation, we propose a hybrid tensor decomposition strategy that combines TT and HT to compress the convolutional and fully connected parts separately, attaining better accuracy than using the TT or HT format alone on convolutional neural networks (CNNs). Our work illuminates the prospects of hybrid tensor decomposition for neural network compression.
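To make the TT compression idea concrete, the sketch below shows how a weight tensor (e.g. a fully connected weight matrix reshaped into a higher-order tensor) can be factorized into TT cores with the standard TT-SVD procedure, using sequential truncated SVDs. This is a minimal numpy illustration under assumed shapes and ranks, not the authors' implementation; the function names `tt_svd` and `tt_reconstruct` are ours.

```python
import numpy as np

def tt_svd(tensor, max_rank):
    """Decompose a d-way tensor into a list of TT cores of shape
    (r_prev, n_k, r_next) via sequential truncated SVDs (TT-SVD)."""
    shape = tensor.shape
    d = len(shape)
    cores = []
    r = 1
    c = tensor.reshape(r * shape[0], -1)
    for k in range(d - 1):
        u, s, vt = np.linalg.svd(c, full_matrices=False)
        r_new = min(max_rank, s.size)          # truncate to the TT rank budget
        cores.append(u[:, :r_new].reshape(r, shape[k], r_new))
        # carry the remainder S @ Vt forward, refolded for the next mode
        c = (s[:r_new, None] * vt[:r_new]).reshape(r_new * shape[k + 1], -1)
        r = r_new
    cores.append(c.reshape(r, shape[-1], 1))
    return cores

def tt_reconstruct(cores):
    """Contract TT cores back into the full tensor."""
    res = cores[0]
    for core in cores[1:]:
        res = np.tensordot(res, core, axes=([-1], [0]))
    return res.reshape([core.shape[1] for core in cores])

# Example: a 4-way reshaping of a hypothetical 256-dim weight matrix slice.
w = np.random.default_rng(0).standard_normal((4, 4, 4, 4))
cores = tt_svd(w, max_rank=16)                 # rank high enough for exact recovery
tt_params = sum(core.size for core in cores)   # parameter count in TT format
```

With a small `max_rank`, `tt_params` drops well below `w.size`, which is the source of the compression; the trade-off between rank and reconstruction error is what the paper's comparison of HT and TT formats is about.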



