Interpretations Steered Network Pruning via Amortized Inferred Saliency Maps

09/07/2022
by   Alireza Ganjdanesh, et al.

Convolutional Neural Network (CNN) compression is crucial for deploying these models on edge devices with limited resources. Existing channel pruning algorithms for CNNs have achieved considerable success on complex models. They approach the pruning problem from various perspectives and use different metrics to guide the pruning process. However, these metrics mainly focus on the model's 'outputs' or 'weights' and neglect its 'interpretations'. To fill this gap, we propose to address the channel pruning problem from a novel perspective by leveraging the interpretations of a model to steer the pruning process, thereby utilizing information from both the inputs and outputs of the model. However, existing interpretation methods cannot be deployed to achieve our goal, as they are either too inefficient for pruning or may predict non-coherent explanations. We tackle this challenge by introducing a selector model that predicts real-time smooth saliency masks for pruned models. We parameterize the distribution of explanatory masks with Radial Basis Function (RBF)-like functions to incorporate the geometric prior of natural images into our selector model's inductive bias. Thus, we can obtain compact representations of explanations that reduce the computational cost of our pruning method. We leverage our selector model to steer the network pruning by maximizing the similarity between the explanatory representations of the pruned and original models. Extensive experiments on the CIFAR-10 and ImageNet benchmark datasets demonstrate the efficacy of our proposed method. Our implementation is available at <https://github.com/Alii-Ganjj/InterpretationsSteeredPruning>
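To make the two key ideas in the abstract concrete — rendering a smooth saliency mask from a compact set of RBF parameters, and steering pruning by matching the explanatory representations of the pruned and original models — here is a minimal, hedged sketch. The function names, the isotropic-Gaussian parameterization, and the cosine-similarity objective are illustrative assumptions, not the paper's exact formulation (see the linked repository for the actual implementation).

```python
import numpy as np

def rbf_saliency_mask(centers, log_widths, amplitudes, height, width):
    """Render a smooth saliency mask as a sum of isotropic RBFs.

    centers:    (K, 2) array of (y, x) centers in normalized [0, 1] coords.
    log_widths: (K,) log of Gaussian widths (exp keeps widths positive).
    amplitudes: (K,) mixing weights for the K basis functions.
    The K * 4 parameters are the "compact representation" of the explanation.
    """
    ys, xs = np.mgrid[0:height, 0:width]
    coords = np.stack([ys / (height - 1), xs / (width - 1)], axis=-1)  # (H, W, 2)
    mask = np.zeros((height, width))
    for (cy, cx), lw, a in zip(centers, log_widths, amplitudes):
        sq_dist = (coords[..., 0] - cy) ** 2 + (coords[..., 1] - cx) ** 2
        mask += a * np.exp(-sq_dist / (2.0 * np.exp(lw) ** 2))
    # Squash to (0, 1) so the mask can gate image pixels smoothly.
    return 1.0 / (1.0 + np.exp(-mask))

def explanation_similarity_loss(params_orig, params_pruned):
    """One minus cosine similarity between the flattened RBF parameter
    vectors of the original and pruned models' explanations; minimizing
    it pushes the pruned model to 'look at' the same image regions."""
    u = np.concatenate([np.ravel(p) for p in params_orig])
    v = np.concatenate([np.ravel(p) for p in params_pruned])
    cos = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-8)
    return 1.0 - cos
```

In this sketch the selector model would predict `centers`, `log_widths`, and `amplitudes` per image, and `explanation_similarity_loss` would be added to the task loss during pruning so that channel removal preserves the model's explanations, not just its outputs.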


