RicciNets: Curvature-guided Pruning of High-performance Neural Networks Using Ricci Flow

07/08/2020
by   Samuel Glass, et al.

We propose a novel method to identify salient computational paths within randomly wired neural networks before training. The computational graph is pruned according to a node mass probability function defined by local graph measures and weighted by hyperparameters produced by a reinforcement learning-based controller network. We use the definition of Ricci curvature to remove edges of low importance before mapping the computational graph to a neural network. We show a reduction of almost 35% in floating-point operations (FLOPs) per pass with no degradation in performance. Furthermore, our method regularizes randomly wired neural networks based on purely structural properties, and the favourable characteristics identified in one network generalize to others. Under similar compression, the method produces networks with better performance than pruning by lowest-magnitude weights. To the best of our knowledge, this is the first work on pruning randomly wired neural networks, and the first to use the topological measure of Ricci curvature in the pruning mechanism.
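To illustrate the core idea of curvature-guided edge removal, the sketch below prunes the lowest-curvature edges of a small graph. It uses the simple combinatorial Forman-Ricci curvature for unweighted graphs, F(u, v) = 4 - deg(u) - deg(v), as a stand-in; the paper's actual curvature definition, node mass function, and controller-produced hyperparameters are not reproduced here, and all function names are illustrative.

```python
# Sketch: prune low-curvature edges from an unweighted graph.
# Illustrative only: uses the simple Forman-Ricci curvature
# F(u, v) = 4 - deg(u) - deg(v), not the paper's exact formulation.
from collections import defaultdict

def forman_curvature(edges):
    """Return {edge: curvature} using the basic Forman formula."""
    deg = defaultdict(int)
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    return {(u, v): 4 - deg[u] - deg[v] for u, v in edges}

def prune_lowest_curvature(edges, fraction):
    """Remove the given fraction of edges, lowest curvature first."""
    curv = forman_curvature(edges)
    k = int(len(edges) * fraction)
    ranked = sorted(edges, key=lambda e: curv[e])  # most negative first
    return ranked[k:]

edges = [("a", "b"), ("a", "c"), ("b", "d"),
         ("c", "d"), ("d", "e"), ("b", "e")]
kept = prune_lowest_curvature(edges, fraction=1 / 3)
# ("b", "d") joins the two highest-degree nodes, has the lowest
# curvature (4 - 3 - 3 = -2), and is pruned first.
```

In this toy example the edge between the two most highly connected nodes has the most negative curvature and is removed first, loosely mirroring how curvature ranks edges by local structural importance.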


Related research:

- 06/30/2021: Curvature Graph Neural Network
- 10/17/2018: Pruning Deep Neural Networks using Partial Least Squares
- 06/30/2020: Training highly effective connectivities within neural networks with randomly initialized, fixed weights
- 07/11/2023: Memorization Through the Lens of Curvature of Loss Function Around Samples
- 10/22/2020: AutoPruning for Deep Neural Network with Dynamic Channel Masking
- 03/08/2023: Loss-Curvature Matching for Dataset Selection and Condensation
- 09/22/2020: Sanity-Checking Pruning Methods: Random Tickets can Win the Jackpot
