A Feature-map Discriminant Perspective for Pruning Deep Neural Networks

05/28/2020
by   Zejiang Hou, et al.

Network pruning has become the de facto tool for accelerating deep neural networks in mobile and edge applications. Recently, feature-map discriminant-based channel pruning has shown promising results, as it aligns well with the CNN objective of differentiating multiple classes and offers better interpretability of the pruning decision. However, existing discriminant-based methods suffer from computational inefficiency, as there is no theoretical guidance for quantifying the discriminant power of a feature map. In this paper, we present a new mathematical formulation that accurately and efficiently quantifies feature-map discriminativeness, giving rise to a novel criterion, Discriminant Information (DI). We analyze the theoretical properties of DI, in particular its non-decreasing property, which makes DI a valid selection criterion. DI-based pruning removes the channels with minimal influence on the DI value, as they contain little information regarding the discriminant power. The versatility of the DI criterion also enables intra-layer mixed-precision quantization to further compress the network. Moreover, we propose a DI-based greedy pruning algorithm and a structure distillation technique to automatically determine a pruned structure that satisfies a given resource budget, a common requirement in practice. Extensive experiments demonstrate the effectiveness of our method: our pruned ResNet50 on ImageNet achieves a 44% reduction without any Top-1 accuracy loss compared to the unpruned model.
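The abstract outlines a two-step pipeline: score each channel by its feature-map discriminant power, then keep the highest-scoring channels until a resource budget is met. The sketch below illustrates that idea with a simple Fisher-style ratio of between-class to within-class variance as a stand-in for the paper's DI criterion; the function names, the per-channel statistics, and the one-shot top-k selection are illustrative assumptions, not the paper's exact formulation or greedy algorithm.

```python
import numpy as np

def discriminant_score(feats, labels):
    """Fisher-style discriminant score per channel (a simplified
    stand-in for the paper's DI criterion).

    feats:  (N, C) array of per-channel feature statistics for N
            samples, e.g. globally average-pooled activations.
    labels: (N,) integer class labels.
    Returns a length-C array of between-class variance divided by
    within-class variance; higher means more discriminative.
    """
    overall_mean = feats.mean(axis=0)
    between = np.zeros(feats.shape[1])
    within = np.zeros(feats.shape[1])
    for c in np.unique(labels):
        cls = feats[labels == c]
        cls_mean = cls.mean(axis=0)
        between += len(cls) * (cls_mean - overall_mean) ** 2
        within += ((cls - cls_mean) ** 2).sum(axis=0)
    return between / (within + 1e-12)  # guard against zero variance

def prune_channels(feats, labels, keep_ratio=0.5):
    """Keep the channels with the highest discriminant score until
    the channel budget (keep_ratio) is met; returns kept indices."""
    scores = discriminant_score(feats, labels)
    n_keep = max(1, int(round(keep_ratio * feats.shape[1])))
    keep = np.argsort(scores)[::-1][:n_keep]
    return np.sort(keep)
```

For example, with pooled activations where one channel's mean shifts strongly with the class label and the others are noise, `prune_channels` with a tight budget retains only the class-dependent channel; the paper's method additionally iterates this selection greedily across layers under a global budget.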


Related research

03/17/2023 - Dynamic Structure Pruning for Compressing CNNs
Structure pruning is an effective method to compress and accelerate neur...

11/19/2016 - Pruning Convolutional Neural Networks for Resource Efficient Inference
We propose a new formulation for pruning convolutional kernels in neural...

02/24/2020 - HRank: Filter Pruning using High-Rank Feature Map
Neural network pruning offers a promising prospect to facilitate deployi...

05/23/2019 - Network Pruning via Transformable Architecture Search
Network pruning reduces the computation costs of an over-parameterized n...

10/21/2021 - CATRO: Channel Pruning via Class-Aware Trace Ratio Optimization
Deep convolutional neural networks are shown to be overkill with high pa...

11/06/2020 - Channel Pruning via Multi-Criteria based on Weight Dependency
Channel pruning has demonstrated its effectiveness in compressing ConvNe...

10/21/2021 - Class-Discriminative CNN Compression
Compressing convolutional neural networks (CNNs) by pruning and distilla...
