Exploring Weight Importance and Hessian Bias in Model Pruning

06/19/2020
by Mingchen Li, et al.

Model pruning is an essential procedure for building compact and computationally efficient machine learning models. A key feature of a good pruning algorithm is that it accurately quantifies the relative importance of the model weights. While model pruning has a rich history, we still lack a full grasp of the pruning mechanics even for relatively simple problems involving linear models or shallow neural nets. In this work, we provide a principled exploration of pruning by building on a natural notion of importance. For linear models, we show that this notion of importance is captured by covariance scaling, which connects to the well-known Hessian-based pruning. We then derive asymptotic formulas that allow us to precisely compare the performance of different pruning methods. For neural networks, we demonstrate that importance can be at odds with magnitude and that proper initialization is critical for magnitude-based pruning. Specifically, we identify settings in which weights become more important despite becoming smaller, which in turn leads to a catastrophic failure of magnitude-based pruning. Our results also show that implicit regularization in the form of Hessian structure plays a catalytic role in identifying the important weights, which in turn dictate pruning performance.
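To make the covariance-scaling connection concrete, here is a minimal NumPy sketch; it is illustrative only, not the paper's code or experimental setup, and all variable names and constants are assumptions. In linear regression with squared loss, the Hessian is the feature covariance Sigma = X^T X / n, so the classical Hessian-based (Optimal-Brain-Damage-style) saliency of weight i is Sigma_ii * w_i^2, i.e., the magnitude rescaled by the feature's second moment. The toy below places a small weight on a high-variance feature, so that weight is more important than a larger one and magnitude pruning removes the wrong coordinate.

```python
import numpy as np

rng = np.random.default_rng(0)

# Linear model y = <w*, x> with two features of very different variances.
# Feature 0: small weight on a high-variance feature; feature 1: large weight,
# low variance. (Illustrative constants, not taken from the paper.)
w_star = np.array([0.3, 1.0])
stds = np.array([10.0, 1.0])

n = 10_000
X = rng.normal(0.0, stds, size=(n, 2))
y = X @ w_star

# Least-squares fit recovers w_star (no label noise here).
w_hat, *_ = np.linalg.lstsq(X, y, rcond=None)

# For squared loss the Hessian is the feature covariance Sigma = X^T X / n,
# so the OBD-style saliency of weight i is Sigma_ii * w_i^2 (up to a 1/2).
Sigma = X.T @ X / n
magnitude_score = np.abs(w_hat)
hessian_score = np.diag(Sigma) * w_hat**2

print("weights:         ", w_hat)            # ~ [0.3, 1.0]
print("magnitude scores:", magnitude_score)  # ranks weight 1 as more important
print("hessian scores:  ", hessian_score)    # ~ [9.0, 1.0]: weight 0 matters more

def loss_after_pruning(scores):
    """Zero out the lowest-scoring weight and report the resulting loss."""
    w = w_hat.copy()
    w[np.argmin(scores)] = 0.0
    return np.mean((X @ w - y) ** 2)

print("loss, magnitude pruning:", loss_after_pruning(magnitude_score))
print("loss, Hessian pruning:  ", loss_after_pruning(hessian_score))
```

On this toy setup, magnitude pruning zeros the small-but-high-variance coordinate and incurs roughly nine times the loss of Hessian-based pruning, mirroring the abstract's point that smaller weights can be the more important ones.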

Related research

02/02/2022 · Cyclical Pruning for Sparse Neural Networks
Current methods for pruning neural network weights iteratively apply mag...

06/12/2023 · Resource Efficient Neural Networks Using Hessian Based Pruning
Neural network pruning is a practical way for reducing the size of train...

07/16/2020 · Lottery Tickets in Linear Models: An Analysis of Iterative Magnitude Pruning
We analyse the pruning procedure behind the lottery ticket hypothesis ar...

06/08/2023 · Magnitude Attention-based Dynamic Pruning
Existing pruning methods utilize the importance of each weight based on ...

12/16/2020 · Neural Pruning via Growing Regularization
Regularization has long been utilized to learn sparsity in deep neural n...

05/12/2021 · Dynamical Isometry: The Missing Ingredient for Neural Network Pruning
Several recent works [40, 24] observed an interesting phenomenon in neur...

05/23/2019 · Computationally Efficient Feature Significance and Importance for Machine Learning Models
We develop a simple and computationally efficient significance test for ...
