Low-Rank Embedding of Kernels in Convolutional Neural Networks under Random Shuffling

10/31/2018
by   Chao Li, et al.

Although convolutional neural networks (CNNs) have recently become popular for various image processing and computer vision tasks, reducing the storage cost of their parameters on resource-limited platforms remains a challenging problem. In previous studies, tensor decomposition (TD) has achieved promising compression performance by embedding the kernel of a convolutional layer into a low-rank subspace. However, TD has been applied naively to the kernel or its specified variants. Unlike these conventional approaches, this paper shows that the kernel can be embedded into more general, or even random, low-rank subspaces. We demonstrate this by compressing convolutional layers via randomly-shuffled tensor decomposition (RsTD) on a standard classification task using CIFAR-10. In addition, we analyze how the spatial similarity of the training data influences the low-rank structure of the kernels. The experimental results show that a CNN can be significantly compressed even if the kernels are randomly shuffled. Furthermore, the RsTD-based method yields more stable classification accuracy than conventional TD-based methods across a large range of compression ratios.
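To make the core idea concrete, here is a minimal NumPy sketch of shuffled low-rank compression: a toy convolutional kernel is flattened, its entries are randomly permuted, the shuffled matrix is approximated by a truncated SVD, and the shuffle is inverted to recover an approximation of the original kernel. This is only an illustration of the general idea; the permutation scheme, decomposition model, and shapes here are assumptions, not the paper's exact RsTD algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy convolutional kernel: (out_channels, in_channels, kH, kW).
kernel = rng.standard_normal((64, 32, 3, 3))

# Flatten to a matrix and randomly shuffle its entries
# (a stand-in for the paper's random shuffling step).
flat = kernel.reshape(64, -1)
perm = rng.permutation(flat.size)
shuffled = flat.ravel()[perm].reshape(flat.shape)

# Low-rank embedding of the shuffled matrix via truncated SVD.
rank = 8
U, s, Vt = np.linalg.svd(shuffled, full_matrices=False)
approx = (U[:, :rank] * s[:rank]) @ Vt[:rank]

# Invert the shuffle to get an approximation of the original kernel.
unshuffled = np.empty(flat.size)
unshuffled[perm] = approx.ravel()
recon = unshuffled.reshape(kernel.shape)

# Storage cost of the factors vs. the full kernel.
compressed = rank * (flat.shape[0] + flat.shape[1])
print(compressed / kernel.size)  # compression ratio, here about 0.15
```

Note that the permutation indices themselves need not be stored as dense data: a fixed random seed is enough to regenerate them, so the storage cost is dominated by the low-rank factors.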


