Towards Compact CNNs via Collaborative Compression

05/24/2021
by   Yuchao Li, et al.

Channel pruning and tensor decomposition have received extensive attention in convolutional neural network compression. However, these two techniques are traditionally deployed in an isolated manner, leading to significant accuracy drops when high compression rates are pursued. In this paper, we propose a Collaborative Compression (CC) scheme, which joins channel pruning and tensor decomposition to compress CNN models by simultaneously learning the model's sparsity and low-rankness. Specifically, we first investigate the compression sensitivity of each layer in the network, and then propose a Global Compression Rate Optimization that transforms the decision problem of per-layer compression rates into an optimization problem. After that, we propose a multi-step heuristic compression that removes redundant compression units step by step, fully accounting for the effect of the remaining compression space (i.e., the unremoved compression units). Our method demonstrates superior performance gains over previous ones on various datasets and backbone architectures. For example, we achieve a 52.9% FLOPs reduction by removing 48.0% of the parameters on ResNet-50, with only a Top-1 accuracy drop of 0.56% on ImageNet 2012.
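To make the idea of compressing a layer along both axes concrete, below is a minimal NumPy sketch. Everything in it is an illustrative assumption rather than the authors' actual CC algorithm: the function names (compress_conv_weight, multi_step_compress), the L1-norm channel-scoring rule, the plain truncated-SVD factorization, and the linear tightening schedule are all placeholders. CC itself learns the sparsity and rank budgets jointly via the Global Compression Rate Optimization described above; this sketch only shows the structure of pruning plus decomposition applied together, and of removing compression units over several steps.

```python
import numpy as np

def compress_conv_weight(weight, keep_channels, rank):
    """Prune output channels, then low-rank-factorize what remains.

    weight: (out_c, in_c, kh, kw) kernel tensor.
    keep_channels: output channels to keep (the sparsity budget).
    rank: target rank of the factorized kernel (the low-rank budget).
    """
    out_c, in_c, kh, kw = weight.shape

    # Channel pruning: keep the output channels with the largest L1 norms.
    norms = np.abs(weight).sum(axis=(1, 2, 3))
    kept_idx = np.sort(np.argsort(norms)[::-1][:keep_channels])
    pruned = weight[kept_idx]

    # Tensor decomposition: truncated SVD of the matricized kernel.
    mat = pruned.reshape(keep_channels, in_c * kh * kw)
    u, s, vt = np.linalg.svd(mat, full_matrices=False)
    r = min(rank, s.size)
    factor_a = u[:, :r] * s[:r]   # (keep_channels, r)
    factor_b = vt[:r]             # (r, in_c * kh * kw)
    return factor_a, factor_b

def multi_step_compress(weight, target_channels, target_rank, steps=4):
    """Tighten both budgets gradually instead of in one shot.

    A real pipeline would fine-tune and re-score the network between
    steps, so that the remaining compression space is re-measured
    after each removal; that part is elided here.
    """
    out_c = weight.shape[0]
    full_rank = min(out_c, int(np.prod(weight.shape[1:])))
    for t in range(1, steps + 1):
        channels = out_c - (out_c - target_channels) * t // steps
        rank = full_rank - (full_rank - target_rank) * t // steps
        a, b = compress_conv_weight(weight, channels, rank)
        weight = (a @ b).reshape(channels, *weight.shape[1:])
        # <- fine-tuning / sensitivity re-estimation would go here
    return a, b

# Demo: compress a random 64-channel 3x3 kernel.
w = np.random.randn(64, 64, 3, 3).astype(np.float32)
a, b = multi_step_compress(w, target_channels=48, target_rank=32)
print(f"params: {w.size} -> {a.size + b.size}")
```

The loop structure is the point of the second function: each pass shrinks the channel and rank budgets a little and would, in a real pipeline, fine-tune before re-scoring, which is what the abstract means by accounting for the effect of the remaining compression space rather than deciding everything in one shot.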


