Zeroth-Order Topological Insights into Iterative Magnitude Pruning

06/14/2022
by Aishwarya Balwani, et al.

Modern-day neural networks are famously large, yet also highly redundant and compressible; the deep learning literature offers numerous pruning strategies that remove over 90% of a network's parameters while still maintaining its original accuracy. Among these many methods, Iterative Magnitude Pruning (IMP), thanks to its conceptual simplicity, ease of implementation, and efficacy, dominates in practice and is the de facto baseline to beat in the pruning community. However, theoretical explanations as to why a method as simple as IMP works at all are few and limited. In this work, we leverage the notion of persistent homology to gain insights into the workings of IMP and show that it inherently encourages the retention of those weights which preserve topological information in a trained network. Subsequently, we also provide bounds on how much different networks can be pruned while perfectly preserving their zeroth-order topological features, and present a modified version of IMP that does the same.
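To make the baseline concrete, the following is a minimal sketch of iterative magnitude pruning on a single weight matrix. It is not the authors' implementation: the retraining step that normally happens between pruning rounds is omitted, and the per-round pruning fraction and iteration count are illustrative assumptions.

```python
import numpy as np

def iterative_magnitude_prune(weights, prune_frac=0.2, iterations=3):
    """Sketch of IMP: repeatedly zero out the smallest-magnitude
    surviving weights. In practice the network is retrained between
    rounds; retraining is omitted here for brevity."""
    w = weights.copy()
    mask = np.ones_like(w, dtype=bool)
    for _ in range(iterations):
        surviving = np.abs(w[mask])
        if surviving.size == 0:
            break
        # Threshold at the prune_frac quantile of the surviving
        # magnitudes, so each round removes ~prune_frac of survivors.
        thresh = np.quantile(surviving, prune_frac)
        mask &= np.abs(w) > thresh
        w[~mask] = 0.0
    return w, mask

# Example: three rounds at 20% each leave roughly 0.8**3 ~ 51% of weights.
rng = np.random.default_rng(0)
w, mask = iterative_magnitude_prune(rng.normal(size=(100, 100)))
sparsity = 1.0 - mask.mean()
```

Because each round thresholds only the weights that survived previous rounds, the overall sparsity compounds geometrically, which is what lets IMP reach 90%+ sparsity over enough rounds with modest per-round fractions.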


Related research

- Lookahead: A Far-Sighted Alternative of Magnitude-based Pruning (02/12/2020). Magnitude-based pruning is one of the simplest methods for pruning neura...
- On Iterative Neural Network Pruning, Reinitialization, and the Similarity of Masks (01/14/2020). We examine how recently documented, fundamental phenomena in deep learni...
- "Understanding Robustness Lottery": A Comparative Visual Analysis of Neural Network Pruning Approaches (06/16/2022). Deep learning approaches have provided state-of-the-art performance in m...
- Distilled Pruning: Using Synthetic Data to Win the Lottery (07/07/2023). This work introduces a novel approach to pruning deep learning models by...
- Iterative Magnitude Pruning as a Renormalisation Group: A Study in The Context of The Lottery Ticket Hypothesis (08/06/2023). This thesis delves into the intricate world of Deep Neural Networks (DNN...
- Universality of Deep Neural Network Lottery Tickets: A Renormalization Group Perspective (10/07/2021). Foundational work on the Lottery Ticket Hypothesis has suggested an exci...
- Lottery Tickets in Linear Models: An Analysis of Iterative Magnitude Pruning (07/16/2020). We analyse the pruning procedure behind the lottery ticket hypothesis ar...
