Pruning untrained neural networks: Principles and Analysis

02/19/2020
by Soufiane Hayou et al.

Overparameterized neural networks display state-of-the-art performance. However, there is a growing need for smaller, energy-efficient neural networks to enable machine learning applications on devices with limited computational resources. A popular approach consists of using pruning techniques. While these techniques have traditionally focused on pruning pre-trained neural networks (e.g. LeCun et al. (1990) and Hassibi et al. (1993)), recent work by Lee et al. (2018) showed promising results where pruning is performed at initialization. However, such procedures remain unsatisfactory, as the resulting pruned networks can be difficult to train and, for instance, these procedures do not prevent a layer from being fully pruned. In this paper we provide a comprehensive theoretical analysis of pruning at initialization and training sparse architectures. This analysis allows us to propose novel principled approaches, which we validate experimentally on a variety of network architectures. In particular, we show that we can prune up to 99.9% of the weights.
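To make the setting concrete, the following is a minimal NumPy sketch of saliency-based pruning at initialization in the spirit of Lee et al. (2018), where connections are ranked globally by |weight × gradient| and the lowest-saliency ones are removed. The `min_keep_per_layer` floor is an illustrative guard against the fully-pruned-layer failure mode the abstract mentions; it is not the method proposed in this paper, and all names, shapes, and values here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "network": two weight matrices at initialization (hypothetical shapes).
weights = [rng.standard_normal((4, 8)), rng.standard_normal((8, 2))]
# Stand-in gradients of the loss w.r.t. each weight matrix; in practice
# these would come from one backward pass on a mini-batch.
grads = [rng.standard_normal(w.shape) for w in weights]

def prune_at_init(weights, grads, sparsity, min_keep_per_layer=1):
    """Rank connections globally by the saliency |w * g| and keep the
    top (1 - sparsity) fraction, forcing at least `min_keep_per_layer`
    weights to survive in every layer so no layer is fully pruned."""
    saliencies = [np.abs(w * g) for w, g in zip(weights, grads)]
    flat = np.concatenate([s.ravel() for s in saliencies])
    k = max(1, int(round((1 - sparsity) * flat.size)))
    threshold = np.sort(flat)[::-1][k - 1]  # k-th largest saliency
    masks = []
    for s in saliencies:
        mask = s >= threshold
        if mask.sum() < min_keep_per_layer:
            # Per-layer floor: keep this layer's top saliencies anyway.
            top = np.argsort(s.ravel())[::-1][:min_keep_per_layer]
            mask = np.zeros(s.size, dtype=bool)
            mask[top] = True
            mask = mask.reshape(s.shape)
        masks.append(mask)
    return [w * m for w, m in zip(weights, masks)]

pruned = prune_at_init(weights, grads, sparsity=0.9)
```

At extreme sparsities (e.g. the 99.9% regime discussed above), the global threshold alone can easily zero out an entire layer, which is why a per-layer constraint of some kind becomes necessary.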

