A Unified Framework for Soft Threshold Pruning

02/25/2023
by Yanqi Chen, et al.

Soft threshold pruning is among the cutting-edge pruning methods with state-of-the-art performance. However, previous methods either search for a threshold scheduler heuristically or simply make the threshold trainable, lacking a theoretical explanation from a unified perspective. In this work, we reformulate soft threshold pruning as an implicit optimization problem solved using the Iterative Shrinkage-Thresholding Algorithm (ISTA), a classic method from the fields of sparse recovery and compressed sensing. Under this theoretical framework, all threshold-tuning strategies proposed in previous studies of soft threshold pruning can be viewed as different ways of tuning the L_1-regularization term. We further derive an optimal threshold scheduler through an in-depth study of threshold scheduling based on our framework. This scheduler keeps the L_1-regularization coefficient stable, implying a time-invariant objective function from the perspective of optimization. In principle, the derived pruning algorithm can sparsify any mathematical model trained via SGD. We conduct extensive experiments and verify its state-of-the-art performance on both Artificial Neural Networks (ResNet-50 and MobileNet-V1) and Spiking Neural Networks (SEW ResNet-18) on the ImageNet dataset. On the basis of this framework, we derive a family of pruning methods, including sparsify-during-training, early pruning, and pruning at initialization. The code is available at https://github.com/Yanqi-Chen/LATS.
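For context on the ISTA connection the abstract refers to, below is a minimal sketch in PyTorch (with hypothetical function names; this is not the authors' LATS implementation). One ISTA iteration on an L_1-regularized objective is a gradient step on the smooth loss followed by soft-thresholding, so the effective pruning threshold is the learning rate times the L_1 coefficient, which is why a threshold schedule corresponds to a schedule of the regularization strength.

```python
import torch

def soft_threshold(w: torch.Tensor, thresh: float) -> torch.Tensor:
    """Proximal operator of the L1 norm: sign(w) * max(|w| - thresh, 0)."""
    return torch.sign(w) * torch.clamp(w.abs() - thresh, min=0.0)

def ista_step(param: torch.Tensor, grad: torch.Tensor,
              lr: float, l1_coeff: float) -> torch.Tensor:
    """One ISTA iteration: SGD step on the smooth loss, then soft-thresholding.

    The threshold applied to the weights equals lr * l1_coeff, so keeping the
    L1 coefficient fixed while the learning rate changes implies a particular
    threshold schedule (hypothetical illustration, not the paper's scheduler).
    """
    updated = param - lr * grad                      # gradient step
    return soft_threshold(updated, lr * l1_coeff)    # proximal (thresholding) step
```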

