The Combinatorial Brain Surgeon: Pruning Weights That Cancel One Another in Neural Networks

03/09/2022
by Xin Yu, et al.

Neural networks tend to achieve better accuracy with training if they are larger – even if the resulting models are overparameterized. Nevertheless, carefully removing such excess parameters before, during, or after training may produce models with similar or even improved accuracy. In many cases, that can, curiously, be achieved by heuristics as simple as removing a fixed percentage of the weights with the smallest absolute value – even though magnitude is not a perfect proxy for weight relevance. On the premise that significantly better pruning performance depends on accounting for the combined effect of removing multiple weights, we revisit one of the classic approaches to impact-based pruning: the Optimal Brain Surgeon (OBS). We propose a tractable heuristic for solving the combinatorial extension of OBS, in which we select weights for simultaneous removal, together with a systematic update of the remaining weights. Our selection method outperforms other methods under high sparsity, and our weight update is advantageous even when combined with those other methods.
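To make the mechanics concrete, below is a minimal NumPy sketch of the OBS machinery this work extends (the function name, toy data, and brute-force pair search are illustrative, not the authors' code). Given the inverse Hessian H^{-1} of the loss, jointly zeroing a set S of weights has a closed-form saliency w_S^T (H^{-1}_{SS})^{-1} w_S / 2 and a closed-form compensating update of the surviving weights; the off-diagonal entries of H^{-1}_{SS} are what capture weights cancelling one another.

    import numpy as np
    from itertools import combinations

    def obs_joint_removal(w, H_inv, S):
        """OBS-style simultaneous removal of the weights indexed by S.

        w     : (n,) current weight vector
        H_inv : (n, n) inverse Hessian of the loss at w
        S     : indices to prune at the same time

        Returns the updated weights (with w[S] == 0) and the joint
        saliency, i.e. the predicted increase in loss.
        """
        S = list(S)
        H_inv_SS = H_inv[np.ix_(S, S)]      # restriction to pruned coords
        w_S = w[S]
        # Lagrange multipliers enforcing the constraint w[S] = 0.
        lam = np.linalg.solve(H_inv_SS, w_S)
        # Joint saliency; the off-diagonal terms of H_inv_SS capture
        # how the removals interact (cancel or reinforce).
        saliency = 0.5 * float(w_S @ lam)
        # Closed-form compensation of the surviving weights.
        w_new = w - H_inv[:, S] @ lam
        w_new[S] = 0.0                      # exact zeros, numerical safety
        return w_new, saliency

    # Toy usage: brute-force the cheapest pair to remove from 8 weights.
    rng = np.random.default_rng(0)
    A = rng.standard_normal((8, 8))
    H = A @ A.T + 1e-2 * np.eye(8)          # synthetic positive-definite Hessian
    H_inv = np.linalg.inv(H)
    w = rng.standard_normal(8)

    best_pair = min(combinations(range(8), 2),
                    key=lambda S: obs_joint_removal(w, H_inv, S)[1])
    w_pruned, cost = obs_joint_removal(w, H_inv, best_pair)
    print(best_pair, cost)

With a single index, S = {q}, this reduces to the classic OBS saliency w_q^2 / (2 [H^{-1}]_{qq}). The combinatorial difficulty the paper addresses is that the number of candidate sets S grows exponentially with the network size, so exhaustive searches like the pair scan above are only feasible at toy scale.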

Related research

08/19/2023 – To prune or not to prune: A chaos-causality approach to principled pruning of dense neural networks
02/28/2023 – Fast as CHITA: Neural Network Pruning with Combinatorial Optimization
06/20/2023 – A Simple and Effective Pruning Approach for Large Language Models
03/19/2021 – Cascade Weight Shedding in Deep Neural Networks: Benefits and Pitfalls for Network Pruning
12/07/2020 – The Role of Regularization in Shaping Weight and Node Pruning Dependency and Dynamics
11/13/2019 – Selective Brain Damage: Measuring the Disparate Impact of Model Pruning
11/23/2020 – Synthesis and Pruning as a Dynamic Compression Strategy for Efficient Deep Neural Networks
