NTK-SAP: Improving neural network pruning by aligning training dynamics

04/06/2023
by Yite Wang, et al.

Pruning neural networks before training has received increasing interest due to its potential to reduce training time and memory. One popular approach is to prune connections based on a saliency metric, but it is not entirely clear which metric is the best choice. Recent advances in neural tangent kernel (NTK) theory suggest that the training dynamics of sufficiently large neural networks are closely related to the spectrum of the NTK. Motivated by this finding, we propose to prune the connections that have the least influence on the spectrum of the NTK. Preserving the NTK spectrum may keep the training dynamics of the pruned network aligned with those of its dense counterpart. One possible issue, however, is that the fixed-weight-NTK corresponding to a given initial point can be very different from the NTK corresponding to later iterates during the training phase. To mitigate this mismatch, we sample multiple realizations of random weights when estimating the NTK spectrum. Note that our approach is weight-agnostic, unlike most existing methods, which are weight-dependent. In addition, we use random inputs to compute the fixed-weight-NTK, making our method data-agnostic as well. We name our foresight pruning algorithm Neural Tangent Kernel Spectrum-Aware Pruning (NTK-SAP). Empirically, NTK-SAP achieves better performance than all baselines on multiple datasets.
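To make the procedure concrete, below is a minimal PyTorch sketch of a spectrum-aware saliency score in the spirit of the abstract: multiple random weight realizations, random inputs, and a finite-difference proxy for the influence of each connection on the fixed-weight-NTK. The function names (`ntk_sap_scores`, `two_layer_mlp`), the perturbation scale `eps`, and the choice of a trace-based finite-difference proxy are illustrative assumptions, not the paper's exact formulation.

```python
# Illustrative sketch only: a finite-difference proxy for the NTK trace,
# not the paper's exact algorithm. All names and defaults are assumptions.
import torch


def two_layer_mlp(x, params):
    """Toy functional network; params = [W1, W2]."""
    w1, w2 = params
    return torch.relu(x @ w1) @ w2


def ntk_sap_scores(model_fn, shapes, input_shape,
                   n_realizations=5, batch_size=32, eps=1e-2):
    """Score each mask entry by how strongly a small random weight
    perturbation changes the network output, averaged over several
    fresh random weight draws."""
    masks = [torch.ones(s, requires_grad=True) for s in shapes]
    for _ in range(n_realizations):
        # Weight-agnostic: draw a fresh random weight realization each time.
        weights = [torch.randn(s) for s in shapes]
        deltas = [eps * torch.randn(s) for s in shapes]
        # Data-agnostic: random Gaussian inputs instead of real data.
        x = torch.randn(batch_size, *input_shape)
        out = model_fn(x, [m * w for m, w in zip(masks, weights)])
        out_pert = model_fn(
            x, [m * (w + d) for m, w, d in zip(masks, weights, deltas)])
        # ||f(x; m*(w+d)) - f(x; m*w)||^2: its gradient w.r.t. the masks
        # accumulates into masks[i].grad across realizations.
        ((out_pert - out) ** 2).sum().backward()
    # Saliency: magnitude of the accumulated mask gradient; connections
    # with the smallest scores are the pruning candidates.
    return [m.grad.abs() for m in masks]


scores = ntk_sap_scores(two_layer_mlp, shapes=[(64, 128), (128, 10)],
                        input_shape=(64,))
```

With the scores in hand, one would keep only the mask entries with the largest scores (for example, via a global threshold at the target sparsity) and then train the pruned network as usual.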


Related research

07/18/2023 - Neural Network Pruning as Spectrum Preserving Process
Neural networks have achieved remarkable performance in various applicat...

03/19/2021 - Cascade Weight Shedding in Deep Neural Networks: Benefits and Pitfalls for Network Pruning
We report, for the first time, on the cascade weight shedding phenomenon...

11/19/2019 - CUP: Cluster Pruning for Compressing Deep Neural Networks
We propose Cluster Pruning (CUP) for compressing and accelerating deep n...

03/26/2023 - Task-oriented Memory-efficient Pruning-Adapter
The outstanding performance and growing size of Large Language Models ha...

01/26/2021 - A Unified Paths Perspective for Pruning at Initialization
A number of recent approaches have been proposed for pruning neural netw...

11/30/2019 - One-Shot Pruning of Recurrent Neural Networks by Jacobian Spectrum Evaluation
Recent advances in the sparse neural network literature have made it pos...

10/22/2020 - Position-Agnostic Multi-Microphone Speech Dereverberation
Neural networks (NNs) have been widely applied in speech processing task...
