Pruning Neural Networks via Coresets and Convex Geometry: Towards No Assumptions

09/18/2022
by   Murad Tukan, et al.

Pruning is one of the predominant approaches for compressing deep neural networks (DNNs). Recently, coresets (provable data summarizations) have been leveraged for pruning DNNs, adding the advantage of theoretical guarantees on the trade-off between the compression rate and the approximation error. However, coresets in this domain were either data-dependent or generated under restrictive assumptions on both the model's weights and inputs. In real-world scenarios, such assumptions are rarely satisfied, limiting the applicability of coresets. To this end, we suggest a novel and robust framework for computing such coresets under mild assumptions on the model's weights and without any assumption on the training data. The idea is to compute the importance of each neuron in each layer with respect to the output of the following layer. This is achieved by a combination of the Löwner ellipsoid and the Carathéodory theorem. Our method is simultaneously data-independent, applicable to various networks and datasets (due to the simplified assumptions), and theoretically supported. Experimental results show that our method outperforms existing coreset-based neural pruning approaches across a wide range of networks and datasets. For example, our method achieved a 62% compression rate on ResNet50 on ImageNet with a 1.09% drop in accuracy.
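To give a concrete sense of coreset-based neuron pruning, the sketch below samples neurons of one layer with probability proportional to an importance score and rescales the surviving connections so the next layer's output is unbiased in expectation. This is a minimal illustration only: the importance proxy here is the norm of each neuron's outgoing weights, not the Löwner-ellipsoid/Carathéodory sensitivities of the paper, and the function names are hypothetical.

```python
import numpy as np

def prune_layer_by_coreset(W_in, W_out, keep, rng=None):
    """Minimal sketch of importance-sampling coreset pruning for one hidden layer.

    W_in:  (h, d) weights producing the h hidden neurons
    W_out: (k, h) weights of the following layer, consuming the h neurons
    keep:  number of neurons to sample (with replacement)

    NOTE: the importance score below (outgoing-weight norm) is a simple
    stand-in, not the paper's Löwner-ellipsoid sensitivity.
    """
    rng = np.random.default_rng(rng)
    h = W_in.shape[0]
    sens = np.linalg.norm(W_out, axis=0) + 1e-12   # per-neuron importance proxy
    p = sens / sens.sum()                          # sampling distribution
    idx = rng.choice(h, size=keep, p=p)            # importance sampling
    # Rescale sampled neurons so the next layer's pre-activation is an
    # unbiased estimate of the original (standard coreset reweighting).
    scale = 1.0 / (keep * p[idx])
    W_in_new = W_in[idx]                           # keep rows of incoming weights
    W_out_new = W_out[:, idx] * scale              # rescale outgoing columns
    return W_in_new, W_out_new, idx
```

Sampling proportionally to importance concentrates the kept neurons on those that most affect the next layer's output, which is the core idea the abstract describes; the paper's contribution is bounding these importances without seeing any data.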


Related research

- DropPruning for Model Compression (12/05/2018)
- Data-Independent Structured Pruning of Neural Networks via Coresets (08/19/2020)
- On Activation Function Coresets for Network Pruning (07/09/2019)
- Hardware-aware Pruning of DNNs using LFSR-Generated Pseudo-Random Indices (11/09/2019)
- SPDY: Accurate Pruning with Speedup Guarantees (01/31/2022)
- RED++: Data-Free Pruning of Deep Neural Networks via Input Splitting and Output Merging (09/30/2021)
- Exploring Learning Dynamics of DNNs via Layerwise Conditioning Analysis (02/25/2020)
