(Pen-) Ultimate DNN Pruning

06/06/2019
by Marc Riera, et al.

DNN pruning reduces the memory footprint and computational work of DNN-based solutions, improving performance and energy efficiency. An effective pruning scheme should systematically remove connections and/or neurons that are unnecessary or redundant, shrinking the DNN without any loss in accuracy. In this paper we show that prior pruning schemes rely on an extremely time-consuming iterative process that retrains the DNN many times to tune the pruning hyperparameters. We propose a DNN pruning scheme based on Principal Component Analysis (PCA) and the relative importance of each neuron's connections that automatically finds the optimized DNN in one shot, without hand-tuning multiple parameters.
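The core PCA idea can be illustrated briefly: applying PCA to a layer's activations reveals how many dimensions carry most of the variance, which suggests how many neurons that layer actually needs. The following is a minimal sketch of that analysis, assuming a variance-retention threshold; the function name and threshold value are illustrative and not the paper's exact procedure:

```python
import numpy as np

def pca_neuron_budget(activations, var_threshold=0.95):
    """Estimate how many neurons a layer needs to retain a given
    fraction of the variance of its activations.

    activations: (num_samples, num_neurons) matrix of layer outputs
    var_threshold: fraction of total activation variance to preserve
    """
    centered = activations - activations.mean(axis=0)
    # Eigenvalues of the covariance matrix give the variance along
    # each principal axis, sorted largest first.
    cov = np.cov(centered, rowvar=False)
    eigvals = np.sort(np.linalg.eigvalsh(cov))[::-1]
    cumulative = np.cumsum(eigvals) / eigvals.sum()
    # Smallest number of components whose cumulative variance share
    # reaches the threshold.
    return int(np.searchsorted(cumulative, var_threshold) + 1)
```

For activations with low-rank structure (e.g., generated from only 3 underlying factors), the returned budget is at most 3, regardless of how many neurons the layer nominally has.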

Related research:

- Towards thinner convolutional neural networks through Gradually Global Pruning (03/29/2017). Deep network pruning is an effective method to reduce the storage and co...
- SInGE: Sparsity via Integrated Gradients Estimation of Neuron Relevance (07/08/2022). The leap in performance in state-of-the-art computer vision methods is a...
- Immunization of Pruning Attack in DNN Watermarking Using Constant Weight Code (07/07/2021). To ensure protection of the intellectual property rights of DNN models, ...
- FeTa: A DCA Pruning Algorithm with Generalization Error Guarantees (03/12/2018). Recent DNN pruning algorithms have succeeded in reducing the number of p...
- Revisiting hard thresholding for DNN pruning (05/21/2019). The most common method for DNN pruning is hard thresholding of network w...
- RED++: Data-Free Pruning of Deep Neural Networks via Input Splitting and Output Merging (09/30/2021). Pruning Deep Neural Networks (DNNs) is a prominent field of study in the...
- Concept-Monitor: Understanding DNN training through individual neurons (04/26/2023). In this work, we propose a general framework called Concept-Monitor to h...
