Carrying out CNN Channel Pruning in a White Box

04/24/2021
by Yuxin Zhang, et al.

Channel pruning has long been adopted for compressing CNNs, significantly reducing overall computation. Prior works implement channel pruning in an unexplainable manner, tending to reduce the final classification error while failing to consider the influence of each individual channel. In this paper, we conduct channel pruning in a white box. Through deep visualization of the feature maps activated by different channels, we observe that different channels contribute differently to different categories in image classification. Inspired by this, we choose to preserve channels that contribute to most categories. Specifically, to model the contribution of each channel to differentiating categories, we develop a class-wise mask for each channel, implemented in a dynamic training manner w.r.t. the input image's category. On the basis of the learned class-wise masks, we perform a global voting mechanism to remove channels with less category discrimination. Lastly, a fine-tuning process is conducted to recover the performance of the pruned model. To the best of our knowledge, this is the first time that CNN interpretability theory has been considered to guide channel pruning. Extensive experiments demonstrate the superiority of our White-Box over many state-of-the-art methods. For instance, on CIFAR-10, it reduces 65.23% of the FLOPs of ResNet-110. On ILSVRC-2012, White-Box achieves a 45.6% FLOPs reduction with only a small top-1 accuracy loss of 0.83%. Our code, logs, and pruned models are anonymously released at https://github.com/zyxxmu/White-Box.
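The global voting step described above can be sketched in a few lines: given learned class-wise mask values for each channel, score a channel by how many categories it contributes to, then remove the globally lowest-scoring channels. This is a minimal numpy sketch under assumptions not in the abstract: the mask shape (channels × classes), the 0.5 activation threshold, and the function name `global_voting_prune` are all hypothetical, and the paper's actual training and voting details differ.

```python
import numpy as np

def global_voting_prune(class_masks, prune_ratio):
    """Hypothetical sketch of class-wise-mask-based global voting.

    class_masks: dict mapping layer name -> (C, K) array, where entry
        [c, k] is the learned mask value of channel c for class k.
    prune_ratio: fraction of all channels (across layers) to remove.
    Returns: dict mapping layer name -> list of pruned channel indices.
    """
    scores = []  # (layer, channel, votes) triples, pooled across layers
    for layer, m in class_masks.items():
        # A channel "votes" for a class when its mask value exceeds a
        # threshold (0.5 here, an assumed cutoff); its score is the
        # number of classes it helps discriminate.
        votes = (np.abs(m) > 0.5).sum(axis=1)
        for c, v in enumerate(votes):
            scores.append((layer, c, int(v)))
    # Remove the channels with the least category discrimination.
    scores.sort(key=lambda t: t[2])
    n_prune = int(len(scores) * prune_ratio)
    pruned = {layer: [] for layer in class_masks}
    for layer, c, _ in scores[:n_prune]:
        pruned[layer].append(c)
    return pruned
```

Because channels from all layers are pooled into one ranking, the per-layer pruning rate is decided globally rather than fixed per layer, matching the "global voting" framing in the abstract.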

