Recall Distortion in Neural Network Pruning and the Undecayed Pruning Algorithm

06/07/2022
by Aidan Good, et al.

Pruning techniques have been successfully used in neural networks to trade accuracy for sparsity. However, the impact of network pruning is not uniform: prior work has shown that recall for underrepresented classes in a dataset may be disproportionately harmed. In this work, we study such relative distortions in recall by hypothesizing an intensification effect that is inherent to the model: pruning makes recall relatively worse for a class whose recall is below the overall accuracy and, conversely, relatively better for a class whose recall is above it. In addition, we propose a new pruning algorithm aimed at attenuating this effect. Through statistical analysis, we observe that intensification is less severe with our algorithm but nevertheless more pronounced with relatively more difficult tasks, less complex models, and higher pruning ratios. More surprisingly, we observe a converse de-intensification effect at lower pruning ratios.
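The hypothesized intensification effect can be stated operationally: for each class, compare its recall against the model's overall accuracy before and after pruning, and check whether that signed gap grows in magnitude while keeping its sign. The sketch below is illustrative only; the function names and toy data are not from the paper, and the paper's own statistical analysis is more involved than this check.

```python
import numpy as np

def per_class_recall(y_true, y_pred, classes):
    # Fraction of examples of each class that are predicted correctly.
    return {c: np.mean(y_pred[y_true == c] == c) for c in classes}

def intensification(y_true, pred_dense, pred_pruned):
    """Illustrative check of the hypothesized intensification effect:
    a class whose recall sat below overall accuracy falls further
    below it after pruning, and one above rises further above."""
    classes = np.unique(y_true)
    acc_dense = np.mean(pred_dense == y_true)
    acc_pruned = np.mean(pred_pruned == y_true)
    rec_dense = per_class_recall(y_true, pred_dense, classes)
    rec_pruned = per_class_recall(y_true, pred_pruned, classes)
    report = {}
    for c in classes:
        gap_dense = rec_dense[c] - acc_dense    # signed gap before pruning
        gap_pruned = rec_pruned[c] - acc_pruned  # signed gap after pruning
        # Intensified if the gap kept its sign and grew in magnitude.
        report[c] = (np.sign(gap_dense) == np.sign(gap_pruned)
                     and abs(gap_pruned) > abs(gap_dense))
    return report

# Toy example: class 0 (underrepresented) loses recall after pruning,
# so both gaps widen away from the shifted overall accuracy.
y_true = np.array([0] * 4 + [1] * 6)
pred_dense = np.array([0, 0, 1, 1, 1, 1, 1, 1, 1, 1])
pred_pruned = np.array([0, 1, 1, 1, 1, 1, 1, 1, 1, 1])
print(intensification(y_true, pred_dense, pred_pruned))
```

Under this toy setup the dense model has accuracy 0.8 with class-0 recall 0.5 (gap -0.3) and class-1 recall 1.0 (gap +0.2); after pruning accuracy drops to 0.7 and the gaps widen to -0.45 and +0.3, so both classes are flagged as intensified.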


Related research

- Cyclical Pruning for Sparse Neural Networks (02/02/2022): Current methods for pruning neural network weights iteratively apply mag...
- The Incredible Shrinking Neural Network: New Perspectives on Learning Representations Through The Lens of Pruning (01/16/2017): How much can pruning algorithms teach us about the fundamentals of learn...
- Selective Brain Damage: Measuring the Disparate Impact of Model Pruning (11/13/2019): Neural network pruning techniques have demonstrated it is possible to re...
- 3D Skeletonization of Complex Grapevines for Robotic Pruning (07/21/2023): Robotic pruning of dormant grapevines is an area of active research in o...
- Pruning a neural network using Bayesian inference (08/04/2023): Neural network pruning is a highly effective technique aimed at reducing...
- Two Kinds of Recall (03/19/2023): It is an established assumption that pattern-based models are good at pr...
- Pruning artificial neural networks: a way to find well-generalizing, high-entropy sharp minima (04/30/2020): Recently, a race towards the simplification of deep networks has begun, ...
