Neural Pruning via Growing Regularization

12/16/2020
by Huan Wang, et al.

Regularization has long been utilized to learn sparsity in deep neural network pruning. However, its role has mainly been explored in the small-penalty-strength regime. In this work, we extend its application to a new scenario where the regularization grows large gradually, in order to tackle two central problems of pruning: the pruning schedule and weight importance scoring. (1) The former topic is newly brought up in this work; we find it critical to pruning performance, yet it has received little research attention. Specifically, we propose an L2 regularization variant with rising penalty factors and show that it brings significant accuracy gains over its one-shot counterpart, even when the same weights are removed. (2) The growing penalty scheme also gives us a way to exploit Hessian information for more accurate pruning without knowing the Hessian's specific values, and it is thus untroubled by the common Hessian approximation problems. Empirically, the proposed algorithms are easy to implement and scale to large datasets and networks in both structured and unstructured pruning. Their effectiveness is demonstrated with modern deep neural networks on the CIFAR and ImageNet datasets, achieving competitive results compared with many state-of-the-art algorithms. Our code and trained models are publicly available at https://github.com/mingsuntse/regularization-pruning.
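To make the growing-penalty idea concrete, below is a minimal, self-contained PyTorch sketch of one plausible reading of the first algorithm: the L2 penalty factor on a set of filters scheduled for removal is increased step by step until those filters are driven toward zero, after which they can be cut with little damage. The hyperparameter names and values (LAMBDA_STEP, UPDATE_INTERVAL, LAMBDA_CEIL) and the L1-magnitude filter selection are illustrative assumptions, not the settings used in the paper or its repository.

import torch
import torch.nn as nn
import torch.nn.functional as F

# Illustrative hyperparameters (assumptions, not the paper's values):
LAMBDA_STEP = 1e-4      # how much the penalty factor grows per update
UPDATE_INTERVAL = 10    # grow the penalty every this many steps
LAMBDA_CEIL = 1.0       # stop growing once the factor reaches this ceiling

def filters_to_prune(conv, prune_ratio):
    """Mark the smallest-magnitude filters for removal (assumed L1 criterion)."""
    scores = conv.weight.detach().abs().mean(dim=(1, 2, 3))  # one score per filter
    n = int(prune_ratio * scores.numel())
    mask = torch.zeros(scores.numel(), dtype=torch.bool)
    mask[scores.argsort()[:n]] = True
    return mask  # True = scheduled for pruning

# Toy model and dummy data, just so the sketch runs end to end.
model = nn.Sequential(
    nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(16, 10))
conv = model[0]
mask = filters_to_prune(conv, prune_ratio=0.5)
opt = torch.optim.SGD(model.parameters(), lr=0.01)
lam = 0.0  # the penalty factor starts at zero and grows

for step in range(200):
    x = torch.randn(8, 3, 32, 32)
    y = torch.randint(0, 10, (8,))
    task_loss = F.cross_entropy(model(x), y)
    # Extra L2 penalty applied only to the filters scheduled for removal.
    reg_loss = lam * (conv.weight[mask] ** 2).sum()
    opt.zero_grad()
    (task_loss + reg_loss).backward()
    opt.step()
    if step % UPDATE_INTERVAL == 0 and lam < LAMBDA_CEIL:
        lam += LAMBDA_STEP  # the "growing" part of growing regularization

# Once the penalty has driven the masked filters near zero, hard-prune them
# and finetune as usual; removing them now costs little accuracy.
with torch.no_grad():
    conv.weight[mask] = 0.0

The point of the schedule is that, unlike a fixed-penalty baseline, the factor starts at zero and only the scheduled weights feel it, so the rest of the network keeps training normally while the doomed weights are annealed away rather than removed in one shot.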


Related research

07/25/2022 · Trainability Preserving Neural Structured Pruning
Several recent works empirically find finetuning learning rate is critic...

06/09/2023 · How Sparse Can We Prune A Deep Network: A Geometric Viewpoint
Overparameterization constitutes one of the most significant hallmarks o...

03/01/2023 · Structured Pruning for Deep Convolutional Neural Networks: A Survey
The remarkable performance of deep convolutional neural networks (CNNs) ...

11/20/2018 · Structured Pruning for Efficient ConvNets via Incremental Regularization
Parameter pruning is a promising approach for CNN compression and accele...

06/19/2020 · Exploring Weight Importance and Hessian Bias in Model Pruning
Model pruning is an essential procedure for building compact and computa...

06/28/2022 · Deep Neural Networks Pruning via the Structured Perspective Regularization
In Machine Learning, Artificial Neural Networks (ANNs) are a very powerf...

07/07/2020 · Lossless CNN Channel Pruning via Gradient Resetting and Convolutional Re-parameterization
Channel pruning (a.k.a. filter pruning) aims to slim down a convolutiona...
