Filter Pruning for Efficient CNNs via Knowledge-driven Differential Filter Sampler

07/01/2023
by Shaohui Lin et al.

Filter pruning simultaneously accelerates computation and reduces the memory overhead of CNNs, making it well suited to edge devices and cloud services. In this paper, we propose a novel Knowledge-driven Differential Filter Sampler (KDFS) with a Masked Filter Modeling (MFM) framework for filter pruning, which globally prunes redundant filters based on the prior knowledge of a pre-trained model in a differentiable, non-alternating optimization. Specifically, we design a differential sampler with learnable sampling parameters to build a binary mask vector for each layer, determining whether the corresponding filters are redundant. To learn the mask, we introduce masked filter modeling to construct PCA-like knowledge by aligning the intermediate features of the pre-trained teacher model with the outputs of a student decoder that takes the sampled features as input. The mask and sampler are directly optimized end-to-end by the Gumbel-Softmax Straight-Through Gradient Estimator, in combination with a global pruning constraint, the MFM reconstruction error, and dark knowledge. Extensive experiments demonstrate the effectiveness of the proposed KDFS in compressing base models on various datasets. For instance, the pruned ResNet-50 on ImageNet achieves a 55.36% computation reduction and a 42.86% parameter reduction while dropping only 0.35% Top-1 accuracy, significantly outperforming state-of-the-art methods. The code is available at <https://github.com/Osilly/KDFS>.
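The core mechanism described above can be sketched in a few lines: a per-layer sampler holds learnable logits for each filter, and the Gumbel-Softmax straight-through trick yields a hard binary keep/prune mask in the forward pass while gradients flow through the soft relaxation. This is a minimal illustrative sketch in PyTorch, not the authors' implementation; the class and variable names are assumptions.

```python
# Hedged sketch of a differentiable binary filter mask via the
# Gumbel-Softmax straight-through estimator, in the spirit of KDFS's
# sampler. Names and structure are illustrative, not the released code.
import torch
import torch.nn.functional as F

class DifferentialFilterSampler(torch.nn.Module):
    """Holds two logits per filter ([prune, keep]) and samples a hard
    binary mask whose gradient flows through the soft relaxation."""
    def __init__(self, num_filters: int, tau: float = 1.0):
        super().__init__()
        self.logits = torch.nn.Parameter(torch.zeros(num_filters, 2))
        self.tau = tau  # relaxation temperature

    def forward(self) -> torch.Tensor:
        # hard=True applies the straight-through trick: the forward pass
        # is one-hot (binary), the backward pass uses the soft sample.
        sample = F.gumbel_softmax(self.logits, tau=self.tau, hard=True)
        return sample[:, 1]  # 1.0 = keep the filter, 0.0 = prune it

# Usage: mask a convolutional layer's output channel-wise.
sampler = DifferentialFilterSampler(num_filters=64)
mask = sampler()                        # binary vector, shape (64,)
feat = torch.randn(8, 64, 16, 16)       # a conv layer's feature map
masked = feat * mask.view(1, -1, 1, 1)  # zero out "pruned" filters
```

Because the mask remains attached to the learnable logits, a sparsity penalty on `mask.sum()` (the global pruning constraint) and the MFM reconstruction loss can both back-propagate into the sampler, which is what allows the pruning decision to be trained end-to-end rather than alternated with weight updates.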


Related research

- Filter Sketch for Network Pruning (01/23/2020): In this paper, we propose a novel network pruning approach by informatio...
- Soft Filter Pruning for Accelerating Deep Convolutional Neural Networks (08/21/2018): This paper proposed a Soft Filter Pruning (SFP) method to accelerate the...
- Importance Estimation for Neural Network Pruning (06/25/2019): Structural pruning of neural network parameters reduces computation, ene...
- Towards Optimal Structured CNN Pruning via Generative Adversarial Learning (03/22/2019): Structured pruning of filters or neurons has received increased focus fo...
- LeGR: Filter Pruning via Learned Global Ranking (04/28/2019): Filter pruning has shown to be effective for learning resource-constrain...
- Training Compact CNNs for Image Classification using Dynamic-coded Filter Fusion (07/14/2021): The mainstream approach for filter pruning is usually either to force a ...
- Training Interpretable Convolutional Neural Networks by Differentiating Class-specific Filters (07/16/2020): Convolutional neural networks (CNNs) have been successfully used in a ra...
