Pruning with Compensation: Efficient Channel Pruning for Deep Convolutional Neural Networks

08/31/2021
by Zhouyang Xie, et al.

Channel pruning is a promising technique for compressing the parameters of deep convolutional neural networks (DCNNs) and speeding up inference. This paper addresses the long-standing inefficiency of channel pruning. Most channel pruning methods recover prediction accuracy by re-training the pruned model from the remaining parameters or from random initialization. This re-training process depends heavily on sufficient computational resources, training data, and human intervention (tuning the training strategy). In this paper, a highly efficient pruning method is proposed to significantly reduce the cost of pruning DCNNs. The main contributions of our method are: 1) pruning compensation, a fast and data-efficient substitute for re-training that minimizes the post-pruning reconstruction loss of features; 2) compensation-aware pruning (CaP), a novel pruning algorithm that removes redundant or less-weighted channels by minimizing the loss of information; and 3) binary structural search with a step constraint to minimize human intervention. On benchmarks including CIFAR-10/100 and ImageNet, our method shows competitive pruning performance among state-of-the-art re-training-based pruning methods and, more importantly, reduces the processing time by 95% and data usage by 90%.
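The abstract does not include code, but the core "pruning compensation" idea (recovering accuracy by minimizing the post-pruning feature reconstruction loss in closed form, rather than re-training) can be illustrated with a rough least-squares sketch on a toy linear layer. The magnitude-based channel selection below is a stand-in assumption, not the paper's compensation-aware (CaP) criterion:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1x1-conv layer treated as a linear map: Y = X @ W, with 8 input channels.
C_in, C_out, N = 8, 4, 256
W = rng.normal(size=(C_in, C_out))   # original weights
X = rng.normal(size=(N, C_in))       # sampled input feature vectors
Y = X @ W                            # original output features

# Prune the 3 input channels whose weight rows have the smallest L2 norm
# (a simple magnitude criterion used here for illustration; the paper's
# CaP criterion is compensation-aware and differs from this).
n_prune = 3
keep = np.sort(np.argsort(np.linalg.norm(W, axis=1))[n_prune:])
X_kept = X[:, keep]

# Compensation: instead of re-training, solve a least-squares problem so the
# remaining weights reconstruct the original output features as closely as
# possible on the sampled data.
W_comp, *_ = np.linalg.lstsq(X_kept, Y, rcond=None)

err_pruned = np.linalg.norm(X_kept @ W[keep] - Y)   # naive pruning error
err_comp = np.linalg.norm(X_kept @ W_comp - Y)      # compensated error
```

Because the least-squares solution minimizes the reconstruction loss over all possible remaining weights, `err_comp` can never exceed `err_pruned`; this closed-form update is what makes compensation far cheaper than re-training.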


Related research

01/18/2022
Pruning-aware Sparse Regularization for Network Pruning
Structural neural network pruning aims to remove the redundant channels ...

10/21/2021
CATRO: Channel Pruning via Class-Aware Trace Ratio Optimization
Deep convolutional neural networks are shown to be overkill with high pa...

02/11/2020
Training Efficient Network Architecture and Weights via Direct Sparsity Control
Artificial neural networks (ANNs) especially deep convolutional networks...

11/10/2020
Stage-wise Channel Pruning for Model Compression
Auto-ML pruning methods aim at searching a pruning strategy automaticall...

05/08/2021
EZCrop: Energy-Zoned Channels for Robust Output Pruning
Recent results have revealed an interesting observation in a trained con...

08/31/2020
Efficient and Sparse Neural Networks by Pruning Weights in a Multiobjective Learning Approach
Overparameterization and overfitting are common concerns when designing ...

10/12/2018
Dynamic Channel Pruning: Feature Boosting and Suppression
Making deep convolutional neural networks more accurate typically comes ...
