Towards Optimal Structured CNN Pruning via Generative Adversarial Learning

03/22/2019
by Shaohui Lin, et al.

Structured pruning of filters or neurons has received increasing attention for compressing convolutional neural networks. Most existing methods rely on multi-stage, layer-wise optimization that iteratively prunes and retrains, which may be suboptimal and computationally intensive. Moreover, these methods are designed to prune a specific structure, such as filters or blocks, rather than jointly pruning heterogeneous structures. In this paper, we propose an effective structured pruning approach that jointly prunes filters as well as other structures in an end-to-end manner. To accomplish this, we first introduce a soft mask to scale the outputs of these structures and define a new objective function with sparsity regularization that aligns the output of the masked network with that of the baseline. We then solve the resulting optimization problem via generative adversarial learning (GAL), which learns the sparse soft mask in a label-free, end-to-end manner. By driving more scaling factors in the soft mask to zero, the fast iterative shrinkage-thresholding algorithm (FISTA) can be leveraged to quickly and reliably remove the corresponding structures. Extensive experiments demonstrate the effectiveness of GAL on different datasets, including MNIST, CIFAR-10, and ImageNet ILSVRC 2012. For example, on ImageNet ILSVRC 2012, the pruned ResNet-50 achieves a 10.88% Top-5 error with a 3.7x speedup, significantly outperforming state-of-the-art methods.
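
To make the mechanism concrete, below is a minimal PyTorch sketch, not the authors' implementation: a learnable soft mask scales each filter's output, the masked network is aligned to a frozen baseline without labels, and an L1 proximal (soft-thresholding) step, the building block of FISTA, drives mask entries to zero so the corresponding filters can be removed. The adversarial discriminator of GAL is replaced here by a simple MSE alignment for brevity, and all names (MaskedConv, soft_threshold, train_step, lam, lr) are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class MaskedConv(nn.Module):
    """Convolution whose output channels are scaled by a learnable soft mask."""

    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.conv = nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1)
        self.mask = nn.Parameter(torch.ones(out_ch))  # one scaling factor per filter

    def forward(self, x):
        return self.conv(x) * self.mask.view(1, -1, 1, 1)


def soft_threshold(w, lam):
    """Proximal operator of the L1 norm: sign(w) * max(|w| - lam, 0)."""
    return torch.sign(w) * torch.clamp(w.abs() - lam, min=0.0)


def train_step(baseline, pruned, x, lam=1e-3, lr=1e-2):
    """One label-free step: align the masked network to the frozen baseline,
    then apply a proximal (soft-thresholding) update to sparsify the masks."""
    pruned.zero_grad()
    with torch.no_grad():
        target = baseline(x)              # no labels needed: use baseline outputs
    loss = F.mse_loss(pruned(x), target)  # alignment term (GAL uses a discriminator here)
    loss.backward()
    with torch.no_grad():
        for m in pruned.modules():
            if isinstance(m, MaskedConv):
                m.mask -= lr * m.mask.grad                      # gradient step on the mask
                m.mask.copy_(soft_threshold(m.mask, lr * lam))  # push small scales to zero
    return loss.item()


# Toy usage: a plain baseline layer and its masked counterpart.
baseline = nn.Sequential(nn.Conv2d(3, 16, 3, padding=1)).eval()
pruned = nn.Sequential(MaskedConv(3, 16))
print(train_step(baseline, pruned, torch.randn(4, 3, 32, 32)))
```

Once the masks have converged, filters whose scaling factors are exactly zero can be physically removed from the convolution, which is what yields the reported speedups.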

Related research:

09/18/2019 · Gate Decorator: Global Filter Pruning Method for Accelerating Deep Convolutional Neural Networks
Filter pruning is one of the most effective ways to accelerate and compr...

02/25/2023 · A Unified Framework for Soft Threshold Pruning
Soft threshold pruning is among the cutting-edge pruning methods with st...

07/05/2017 · Data-Driven Sparse Structure Selection for Deep Neural Networks
Deep convolutional neural networks have liberated its extraordinary powe...

01/23/2019 · Towards Compact ConvNets via Structure-Sparsity Regularized Filter Pruning
The success of convolutional neural networks (CNNs) in computer vision a...

07/01/2023 · Filter Pruning for Efficient CNNs via Knowledge-driven Differential Filter Sampler
Filter pruning simultaneously accelerates the computation and reduces th...

10/15/2021 · Fire Together Wire Together: A Dynamic Pruning Approach with Self-Supervised Mask Prediction
Dynamic model pruning is a recent direction that allows for the inferenc...

04/13/2022 · Receding Neuron Importances for Structured Pruning
Structured pruning efficiently compresses networks by identifying and re...
