Semi-tensor Product-based Tensor Decomposition for Neural Network Compression

09/30/2021
by   Hengling Zhao, et al.

Existing tensor networks adopt the conventional matrix product to connect factors. The classical matrix product requires strict dimensionality consistency between factors, which can result in redundancy in data representation. In this paper, the semi-tensor product is used to generalize the classical matrix product-based mode product to a semi-tensor mode product. Because it permits the connection of two factors with different dimensionality, more flexible and compact tensor decompositions can be obtained with smaller factor sizes. Tucker decomposition, Tensor Train (TT) and Tensor Ring (TR) are common decompositions for low-rank compression of deep neural networks. The semi-tensor product is applied to these tensor decompositions to obtain their generalized versions, i.e., semi-tensor Tucker decomposition (STTu), semi-tensor train (STT) and semi-tensor ring (STR). Experimental results show that STTu, STT and STR achieve higher compression factors than the conventional tensor decompositions at the same accuracy, with shorter training times, in ResNet and WideResNet compression. With 2% accuracy degradation, the TT-RN (rank = 14) and the TR-WRN (rank = 16) obtain only 3 times and 99 times compression factors, while the STT-RN (rank = 14) and the STR-WRN (rank = 16) achieve 9 times and 179 times compression factors, respectively.
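To make the dimension-mismatch idea concrete, below is a minimal NumPy sketch of the left semi-tensor product of two matrices, the operation the semi-tensor mode product is built on. The function name left_stp and the example shapes are illustrative choices, not taken from the paper; the sketch assumes the standard definition, in which the smaller factor is expanded by a Kronecker product with an identity matrix so that ordinary matrix multiplication applies.

```python
import numpy as np

def left_stp(A: np.ndarray, B: np.ndarray) -> np.ndarray:
    """Left semi-tensor product of matrices A (m x n) and B (p x q).

    Requires that n is a multiple of p, or p a multiple of n.
    With t the ratio, the smaller factor is inflated via a
    Kronecker product with the identity I_t, after which the
    classical matrix product applies. When n == p this reduces
    exactly to the ordinary matrix product.
    """
    m, n = A.shape
    p, q = B.shape
    if n % p == 0:        # A's columns are a multiple of B's rows
        t = n // p
        return A @ np.kron(B, np.eye(t))
    elif p % n == 0:      # B's rows are a multiple of A's columns
        t = p // n
        return np.kron(A, np.eye(t)) @ B
    raise ValueError(f"Incompatible shapes: n={n}, p={p} must satisfy "
                     "a multiple relation")

# A (2 x 6) factor can connect to a (3 x 4) factor because 6 = 2 * 3,
# even though 6 != 3; the classical product would reject this pair.
A = np.random.randn(2, 6)
B = np.random.randn(3, 4)
C = left_stp(A, B)
print(C.shape)  # (2, 8): the column count grows by the ratio t = 2
```

This relaxed connection rule is what lets the semi-tensor variants use smaller factor matrices than their Tucker, TT and TR counterparts while still reconstructing tensors of the same overall size.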


research · 01/14/2023
A tensor SVD-like decomposition based on the semi-tensor product of tensors
In this paper, we define a semi-tensor product for third-order tensors. ...

research · 11/09/2018
The trouble with tensor ring decompositions
The tensor train decomposition decomposes a tensor into a "train" of 3-w...

research · 11/09/2018
Deep Compression of Sum-Product Networks on Tensor Networks
Sum-product networks (SPNs) represent an emerging class of neural networ...

research · 05/26/2019
HadaNets: Flexible Quantization Strategies for Neural Networks
On-board processing elements on UAVs are currently inadequate for traini...

research · 07/02/2023
TensorGPT: Efficient Compression of the Embedding Layer in LLMs based on the Tensor-Train Decomposition
High-dimensional token embeddings underpin Large Language Models (LLMs),...

research · 05/09/2023
How Informative is the Approximation Error from Tensor Decomposition for Neural Network Compression?
Tensor decompositions have been successfully applied to compress neural ...

research · 09/22/2020
Heuristic Rank Selection with Progressively Searching Tensor Ring Network
Recently, Tensor Ring Networks (TRNs) have been applied in deep networks...
