RED: Looking for Redundancies for Data-Free Structured Compression of Deep Neural Networks

05/31/2021
by   Edouard Yvinec, et al.

Deep Neural Networks (DNNs) are ubiquitous in today's computer vision landscape, despite their considerable computational cost. The mainstream approaches for runtime acceleration prune either connections (unstructured pruning) or, better, filters (structured pruning), and both often require data to re-train the model. In this paper, we present RED, a unified, data-free approach to structured pruning. First, we propose a novel adaptive hashing of the scalar DNN weight distribution densities that increases the number of neurons with identical weight vectors. Second, we prune the network by merging redundant neurons based on their relative similarity, as defined by their distance. Third, we propose a novel uneven depthwise separation technique to further prune convolutional layers. We demonstrate across a large variety of benchmarks that RED largely outperforms other data-free pruning methods, often reaching performance similar to unconstrained, data-driven methods.
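The first two steps of the abstract can be illustrated with a small numpy sketch. This is not the paper's implementation: the adaptive, density-aware hashing is replaced here by a simple uniform quantization (an assumption for brevity), and the merge step treats neurons as rows of a fully-connected layer's weight matrix. The idea it shows is the core one: after hashing, neurons with identical weight vectors are redundant, so they can be merged by keeping one copy and summing their outgoing weights, which preserves the layer's function on the quantized weights.

```python
import numpy as np

def hash_weights(W, num_bins=16):
    # Quantize every scalar weight to its nearest bin center. RED uses an
    # adaptive hashing driven by the weight density; uniform bins are a
    # simplified stand-in used here for illustration only.
    centers = np.linspace(W.min(), W.max(), num_bins)
    idx = np.abs(W[..., None] - centers).argmin(axis=-1)
    return centers[idx]

def merge_identical_neurons(W_in, W_out):
    # W_in:  (n_neurons, n_inputs)  -- incoming weights, one row per neuron
    # W_out: (n_outputs, n_neurons) -- outgoing weights, one column per neuron
    # Neurons with identical (hashed) rows compute identical activations,
    # so we keep one representative and add up their outgoing columns.
    kept, out_cols = {}, {}
    for i, row in enumerate(W_in):
        key = tuple(row)
        if key not in kept:
            kept[key] = row
            out_cols[key] = W_out[:, i].copy()
        else:
            out_cols[key] += W_out[:, i]
    W_in_merged = np.stack(list(kept.values()))
    W_out_merged = np.stack([out_cols[k] for k in kept], axis=1)
    return W_in_merged, W_out_merged
```

Because duplicated neurons produce identical activations under any elementwise nonlinearity, summing their outgoing columns leaves the two-layer composition unchanged while shrinking the hidden dimension, which is what makes the pruning structured rather than merely sparse.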


Related research

09/30/2021  RED++: Data-Free Pruning of Deep Neural Networks via Input Splitting and Output Merging
04/12/2020  A Unified DNN Weight Compression Framework Using Reweighted Optimization Methods
03/01/2023  Structured Pruning for Deep Convolutional Neural Networks: A Survey
07/25/2022  Trainability Preserving Neural Structured Pruning
12/02/2018  Network Compression via Recursive Bayesian Pruning
11/10/2020  Dirichlet Pruning for Neural Network Compression
06/22/2020  Slimming Neural Networks using Adaptive Connectivity Scores
