Magnitude and Uncertainty Pruning Criterion for Neural Networks

12/10/2019
by Vinnie Ko, et al.

Neural networks have achieved dramatic improvements in recent years and now represent the state of the art for many real-world tasks. One drawback, however, is that many of these models are overparameterized, which makes them both computationally and memory intensive. Furthermore, overparameterization can also lead to undesired overfitting side-effects. Inspired by recently proposed magnitude-based pruning schemes and the Wald test from the field of statistics, we introduce a novel magnitude and uncertainty (M&U) pruning criterion that helps to lessen these shortcomings. One important advantage of our M&U pruning criterion is that it is scale-invariant, whereas purely magnitude-based pruning criteria are sensitive to rescalings of the weights. In addition, we present a "pseudo bootstrap" scheme, which efficiently estimates the uncertainty of the weights from their update information during training. Our experimental evaluation, based on various neural network architectures and datasets, shows that the new criterion leads to more compressed models than magnitude-based pruning criteria alone, while at the same time losing less predictive power.
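To make the idea concrete, the following minimal sketch scores each weight Wald-style, dividing its magnitude by an uncertainty estimate, and approximates that uncertainty from the weight's trajectory over recent training iterations as a stand-in for the paper's pseudo bootstrap. The function names, the snapshot-based estimator, and the thresholding logic are illustrative assumptions, not the authors' exact formulation.

```python
# Illustrative sketch only: a Wald-style pruning score in the spirit of
# the M&U criterion, with per-weight uncertainty approximated from the
# weight trajectory observed during training (a stand-in for the paper's
# "pseudo bootstrap"). Names and the exact score are assumptions.
import numpy as np

def pseudo_bootstrap_std(weight_snapshots):
    """Per-weight uncertainty from snapshots of the (flattened) weight
    vector collected over the last training iterations."""
    # weight_snapshots: shape (num_snapshots, num_weights)
    return np.std(weight_snapshots, axis=0, ddof=1)

def wald_scores(weights, weight_std, eps=1e-12):
    """Wald-style score |w| / std(w): keep weights whose magnitude is
    large relative to their estimated uncertainty."""
    return np.abs(weights) / (weight_std + eps)

def prune_mask(weights, weight_std, sparsity=0.9):
    """0/1 mask that zeroes the `sparsity` fraction of weights with the
    smallest scores; multiply it onto the weight vector to prune."""
    scores = wald_scores(weights, weight_std)
    k = int(sparsity * weights.size)
    threshold = np.partition(scores, k)[k]  # k-th smallest score
    return (scores >= threshold).astype(weights.dtype)

# Toy usage: five snapshots of a ten-weight layer taken during training.
rng = np.random.default_rng(0)
snapshots = rng.normal(size=(5, 10))
w = snapshots[-1]                      # final weights
mask = prune_mask(w, pseudo_bootstrap_std(snapshots), sparsity=0.5)
w_pruned = w * mask
```

Note that rescaling a layer multiplies both a weight's magnitude and its trajectory-based standard deviation by the same factor, so the ratio is unchanged (up to the eps guard); this matches the scale-invariance the abstract attributes to the M&U criterion, and it is exactly what a purely magnitude-based score lacks.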

Related research

02/12/2020 · Lookahead: A Far-Sighted Alternative of Magnitude-based Pruning
Magnitude-based pruning is one of the simplest methods for pruning neura...

02/16/2023 · WHC: Weighted Hybrid Criterion for Filter Pruning on Convolutional Neural Networks
Filter pruning has attracted increasing attention in recent years for it...

12/11/2021 · CHAMP: Coherent Hardware-Aware Magnitude Pruning of Integrated Photonic Neural Networks
We propose a novel hardware-aware magnitude pruning technique for cohere...

05/30/2023 · Generalization Bounds for Magnitude-Based Pruning via Sparse Matrix Sketching
In this paper, we derive a novel bound on the generalization error of Ma...

06/01/2020 · Pruning via Iterative Ranking of Sensitivity Statistics
With the introduction of SNIP [arXiv:1810.02340v2], it has been demonstr...

10/17/2022 · Principled Pruning of Bayesian Neural Networks through Variational Free Energy Minimization
Bayesian model reduction provides an efficient approach for comparing th...

03/02/2023 · Average of Pruning: Improving Performance and Stability of Out-of-Distribution Detection
Detecting Out-of-distribution (OOD) inputs have been a critical issue fo...
