Greedy Optimization Provably Wins the Lottery: Logarithmic Number of Winning Tickets is Enough

10/29/2020
by Mao Ye, et al.

Despite the great success of deep learning, recent works show that large deep neural networks are often highly redundant and can be significantly reduced in size. However, the theoretical question of how much a neural network can be pruned under a specified tolerance for accuracy drop remains open. This paper provides one answer by proposing a pruning method based on greedy optimization. The method comes with a guarantee that the discrepancy between the pruned network and the original network decays at an exponentially fast rate with respect to the size of the pruned network, under weak assumptions that hold in most practical settings; equivalently, a pruned network whose size grows only logarithmically in the inverse of the target discrepancy suffices, which is the "logarithmic number of winning tickets" of the title. Empirically, our method improves on prior art when pruning various network architectures, including ResNet and MobileNetV2/V3, on ImageNet.
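The greedy optimization described in the abstract can be pictured as forward selection: start from an empty subnetwork and repeatedly add the neuron (or channel) that most reduces the discrepancy to the original network's output. Below is a minimal, illustrative sketch of this idea on a single layer using a plain least-squares discrepancy; the names (`greedy_forward_select`, `unit_outputs`, `full_output`) are hypothetical, and the authors' actual procedure is more involved (it operates on full networks and comes with the theoretical guarantee above), so this is only a sketch under those assumptions.

```python
# Minimal, illustrative sketch (not the authors' implementation): greedy forward
# selection of neurons for a single layer, scored by squared discrepancy to the
# unpruned layer's output. All names here are hypothetical.
import numpy as np

def greedy_forward_select(unit_outputs, full_output, k):
    """Greedily pick up to k neurons whose averaged outputs best match full_output.

    unit_outputs: (n_units, n_samples, dim) activations of each candidate neuron
    full_output:  (n_samples, dim) output of the unpruned layer
    Returns (selected indices, final mean squared discrepancy).
    """
    n_units = unit_outputs.shape[0]
    selected, running_sum = [], np.zeros_like(full_output)
    final_err = np.mean(full_output ** 2)  # discrepancy of the empty subnetwork
    for step in range(1, k + 1):
        best_err, best_idx = np.inf, None
        for i in range(n_units):
            if i in selected:
                continue
            # Subnetwork output if neuron i were added (uniform averaging).
            trial = (running_sum + unit_outputs[i]) / step
            err = np.mean((trial - full_output) ** 2)
            if err < best_err:
                best_err, best_idx = err, i
        if best_idx is None:  # no candidates left to add
            break
        selected.append(best_idx)
        running_sum += unit_outputs[best_idx]
        final_err = best_err
    return selected, final_err

# Toy usage: 64 candidate neurons, 128 samples, 10-dimensional outputs.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    unit_outputs = rng.normal(size=(64, 128, 10))
    full_output = unit_outputs.mean(axis=0)
    kept, err = greedy_forward_select(unit_outputs, full_output, k=8)
    print(kept, err)
```

In this toy setting the discrepancy shrinks as more neurons are kept; the paper's result is that, under weak assumptions, this kind of greedy selection drives the discrepancy down exponentially fast in the pruned network's size.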

Related research

03/03/2020 · Good Subnetworks Provably Exist: Pruning via Greedy Forward Selection
Recent empirical works show that large deep neural networks are often hi...

02/22/2022 · HRel: Filter Pruning based on High Relevance between Activation Maps and Class Labels
This paper proposes an Information Bottleneck theory based filter prunin...

04/30/2021 · Post-training deep neural network pruning via layer-wise calibration
We present a post-training weight pruning method for deep neural network...

02/24/2022 · Optimal channel selection with discrete QCQP
Reducing the high computational cost of large convolutional neural netwo...

12/09/2022 · Optimizing Learning Rate Schedules for Iterative Pruning of Deep Neural Networks
The importance of learning rate (LR) schedules on network pruning has be...

10/11/2019 · SiPPing Neural Networks: Sensitivity-informed Provable Pruning of Neural Networks
We introduce a pruning algorithm that provably sparsifies the parameters...

06/22/2020 · Logarithmic Pruning is All You Need
The Lottery Ticket Hypothesis is a conjecture that every large neural ne...
