RED++ : Data-Free Pruning of Deep Neural Networks via Input Splitting and Output Merging

09/30/2021
by Edouard Yvinec, et al.

Pruning Deep Neural Networks (DNNs) is a prominent field of study aimed at accelerating inference. In this paper, we introduce RED++, a novel data-free pruning protocol. Requiring only a trained neural network and agnostic to the DNN architecture, it exploits an adaptive, data-free scalar hashing that exposes redundancies among neuron weight values. We study theoretical and empirical guarantees on the preservation of accuracy under this hashing, as well as the pruning ratio expected from exploiting the resulting redundancies. We then propose a novel data-free pruning technique for DNN layers that removes input-wise redundant operations. This algorithm is straightforward, parallelizable, and offers a novel perspective on DNN pruning by shifting the burden of large computations to efficient memory access and allocation. We provide theoretical guarantees on the performance of RED++ and empirically demonstrate its superiority over other data-free pruning methods, as well as its competitiveness with data-driven ones, on ResNets, MobileNets and EfficientNets.
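The core idea of hashing weights to expose and merge redundant neurons can be sketched with a minimal example. This is not the authors' adaptive algorithm: the uniform grid width `step`, the function names, and the toy weight matrix are all illustrative assumptions; the sketch only shows how snapping nearby scalar values to shared representatives makes some neurons exactly identical, so their outputs can be computed once and shared.

```python
import numpy as np

def hash_weights(w, step=0.05):
    # Snap every scalar weight to a uniform grid of width `step`
    # (a simplified, non-adaptive stand-in for the paper's hashing).
    # Nearby values collapse onto one representative, which can make
    # whole neurons identical.
    return np.round(w / step) * step

def merge_redundant_neurons(w):
    # Merge output neurons (rows of `w`) whose hashed weights coincide.
    # Returns the deduplicated weight matrix and, for each kept row,
    # how many original rows it stands for; at inference, one output
    # would be computed and reused for all of them.
    unique_rows, counts = np.unique(w, axis=0, return_counts=True)
    return unique_rows, counts

# Hypothetical toy layer: rows 0 and 2 become identical after hashing.
w = np.array([[0.101, -0.248],
              [0.500,  0.700],
              [0.099, -0.252]])
hashed = hash_weights(w)
pruned, counts = merge_redundant_neurons(hashed)
print(pruned.shape[0])  # 2 rows remain out of the original 3
```

Note that no input data is needed at any point: the redundancies are found purely from the weight values, which is what makes the approach data-free.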

