Exploiting Channel Similarity for Accelerating Deep Convolutional Neural Networks

08/06/2019
by Yunxiang Zhang, et al.

To address the limitations of existing magnitude-based pruning algorithms in cases where model weights or activations are of large and similar magnitude, we propose a novel perspective for discovering parameter redundancy among channels and accelerating deep CNNs via channel pruning. Specifically, we argue that channels revealing similar feature information have functional overlap, and that most channels within each such similarity group can be removed without compromising the model's representational power. After deriving an effective metric for evaluating channel similarity through probabilistic modeling, we introduce a pruning algorithm based on hierarchical clustering of channels. Notably, the proposed algorithm does not rely on sparsity training techniques or complex data-driven optimization and can be applied directly to pre-trained models. Extensive experiments on benchmark datasets demonstrate the superior acceleration performance of our approach over prior art. On ImageNet, our pruned ResNet-50 with 30…
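The abstract only sketches the pipeline; the following is a minimal illustration of the "cluster channels, keep one representative per group" step, assuming channels are compared through per-channel activation statistics gathered on a small calibration set. The paper derives its own similarity metric via probabilistic modeling, so the distance used here, and the names prune_by_channel_similarity, feats, and num_keep, are hypothetical stand-ins, not the authors' code.

```python
# A minimal sketch (not the authors' exact metric): cluster the output
# channels of one conv layer by similarity of their activation statistics,
# then keep one representative channel per similarity cluster.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

def prune_by_channel_similarity(feats, num_keep):
    """feats: (C, N) array, one row of activation statistics per channel
    (e.g. responses on a small calibration set). Returns the indices of
    the channels to keep, one representative per similarity cluster."""
    # Pairwise distances between channel responses; channels with
    # near-identical responses carry largely redundant information.
    dists = pdist(feats, metric="euclidean")
    # Agglomerative (hierarchical) clustering with average linkage.
    tree = linkage(dists, method="average")
    # Cut the dendrogram into at most num_keep clusters.
    labels = fcluster(tree, t=num_keep, criterion="maxclust")
    keep = []
    for c in np.unique(labels):
        members = np.where(labels == c)[0]
        # Keep the member closest to the cluster mean as the
        # representative; the rest are treated as redundant and pruned.
        center = feats[members].mean(axis=0)
        offsets = np.linalg.norm(feats[members] - center, axis=1)
        keep.append(int(members[np.argmin(offsets)]))
    return sorted(keep)

# Example: 64 channels described by 128-dim response vectors; keep ~40.
feats = np.random.randn(64, 128)
print(prune_by_channel_similarity(feats, num_keep=40))
```

Because the clustering operates on recorded statistics rather than on a trained sparsity pattern, a sketch like this can be applied directly to a pre-trained model, which is the property the abstract emphasizes.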


Related research

02/23/2020
Gradual Channel Pruning while Training using Feature Relevance Scores for Convolutional Neural Networks
The enormous inference cost of deep neural networks can be scaled down b...

10/28/2018
Discrimination-aware Channel Pruning for Deep Neural Networks
Channel pruning is one of the predominant approaches for deep model comp...

02/10/2021
CIFS: Improving Adversarial Robustness of CNNs via Channel-wise Importance-based Feature Selection
We investigate the adversarial robustness of CNNs from the perspective o...

05/22/2020
PruneNet: Channel Pruning via Global Importance
Channel pruning is one of the predominant approaches for accelerating de...

10/21/2021
CATRO: Channel Pruning via Class-Aware Trace Ratio Optimization
Deep convolutional neural networks are shown to be overkill with high pa...

01/04/2020
Discrimination-aware Network Pruning for Deep Model Compression
We study network pruning which aims to remove redundant channels/kernels...

11/14/2022
Pruning Very Deep Neural Network Channels for Efficient Inference
In this paper, we introduce a new channel pruning method to accelerate v...
