The Incredible Shrinking Neural Network: New Perspectives on Learning Representations Through The Lens of Pruning

01/16/2017
by   Aditya Sharma, et al.
How much can pruning algorithms teach us about the fundamentals of learning representations in neural networks? And how much can these fundamentals help in devising new pruning techniques? A lot, it turns out. Neural network pruning has become a topic of great interest in recent years, and many different techniques have been proposed to address it. The decision of what to prune and when to prune necessarily forces us to confront our assumptions about how neural networks actually learn to represent patterns in data. In this work, we set out to test several long-held hypotheses about how neural networks learn representations, about approaches to pruning, and about the relevance of each in the context of the other. To do so, we argue in favor of pruning whole neurons rather than the traditional approach of pruning individual weights from optimally trained networks. We first review the historical literature, point out some common assumptions it makes, and propose methods to demonstrate the inherent flaws in these assumptions. We then propose our novel approach to pruning and analyze the quality of the decisions it makes. This analysis led us to question the validity of many widely held assumptions behind pruning algorithms and the trade-offs we often make in the interest of reducing computational complexity. We discovered that there is a straightforward, albeit expensive, way to serially prune 40-70% of the neurons in a trained network with minimal effect on the learning representation and without any re-training. We note that the motivation behind this work is not to propose an algorithm that outperforms all existing methods, but to shed light on what some inherent flaws in these methods can teach us about learning representations, and how this can lead to superior pruning techniques.
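
As a concrete, hedged illustration of the kind of procedure the abstract alludes to, the sketch below greedily prunes whole neurons from a small trained network one at a time: every remaining hidden neuron is tentatively masked, the masked network is re-evaluated on held-out data, and the removal that degrades the loss the least is kept, with no re-training at any point. The toy network, data, cross-entropy criterion, and the 50% target fraction are illustrative assumptions for this sketch, not the authors' exact experimental setup.

# Illustrative sketch (not the authors' exact algorithm): greedy, brute-force
# serial pruning of whole neurons from a trained two-layer network, no re-training.
import numpy as np

rng = np.random.default_rng(0)

# A toy "trained" network: random weights stand in for real trained parameters;
# in practice W1, b1, W2, b2 would come from an optimally trained model.
n_in, n_hidden, n_out = 8, 32, 3
W1, b1 = rng.normal(size=(n_in, n_hidden)), np.zeros(n_hidden)
W2, b2 = rng.normal(size=(n_hidden, n_out)), np.zeros(n_out)

X = rng.normal(size=(200, n_in))              # held-out evaluation data (stand-in)
y = rng.integers(0, n_out, size=200)          # stand-in labels

def forward(X, mask):
    """Forward pass with a binary mask over hidden neurons (0 = pruned)."""
    h = np.maximum(X @ W1 + b1, 0.0) * mask   # ReLU hidden layer, masked
    logits = h @ W2 + b2
    logits -= logits.max(axis=1, keepdims=True)
    p = np.exp(logits)
    return p / p.sum(axis=1, keepdims=True)   # softmax probabilities

def loss(mask):
    """Cross-entropy of the masked network on the evaluation set."""
    p = forward(X, mask)
    return -np.mean(np.log(p[np.arange(len(y)), y] + 1e-12))

mask = np.ones(n_hidden)
target_fraction = 0.5                          # e.g. prune half of the hidden neurons
while 1.0 - mask.mean() < target_fraction:
    candidates = np.flatnonzero(mask)
    # Brute force: score every remaining neuron by the loss after removing it alone.
    scores = []
    for j in candidates:
        trial = mask.copy()
        trial[j] = 0.0
        scores.append(loss(trial))
    best = int(candidates[int(np.argmin(scores))])   # least-damaging neuron
    mask[best] = 0.0
    print(f"pruned neuron {best:3d}  remaining {int(mask.sum()):3d}  loss {min(scores):.4f}")

The brute-force re-evaluation of every candidate at every step is what makes such a procedure expensive: pruning k neurons from a layer of n costs on the order of n*k forward passes over the evaluation set, which is the trade-off the abstract acknowledges.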

