On The Energy Statistics of Feature Maps in Pruning of Neural Networks with Skip-Connections

01/26/2022
by Mohammadreza Soltani, et al.

We propose a new structured pruning framework for compressing Deep Neural Networks (DNNs) with skip connections, based on measuring the statistical dependence between hidden layers and predicted outputs. The dependence measure, defined via the energy statistics of the hidden layers, serves as a model-free measure of information between the feature maps and the output of the network. The estimated dependence measure is then used to prune a collection of redundant and uninformative layers. Because the measure is model-free, no parametric assumptions on the feature-map distribution are required, which makes it computationally appealing for the very high-dimensional feature spaces in DNNs. Extensive numerical experiments on various architectures show the efficacy of the proposed pruning approach, with performance competitive with state-of-the-art methods.
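The abstract does not specify the exact estimator, but a standard energy-statistics dependence measure is the sample distance covariance of Székely and Rizzo. Below is a minimal NumPy sketch of how such a score could be computed between a layer's feature maps and the network output, and used to rank layers as pruning candidates. The function names (`distance_covariance`, `rank_layers_by_dependence`) and the ranking heuristic are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def distance_covariance(x, y):
    """Sample distance covariance (Szekely-Rizzo energy statistics):
    a model-free measure of dependence between samples x (n, p) and y (n, q).
    NOTE: this is one common energy-statistics estimator, assumed here for
    illustration; the paper may use a different variant."""
    def doubly_centered_distances(z):
        d = np.linalg.norm(z[:, None, :] - z[None, :, :], axis=-1)  # pairwise Euclidean distances
        # Double centering: subtract row and column means, add back the grand mean.
        return d - d.mean(axis=0, keepdims=True) - d.mean(axis=1, keepdims=True) + d.mean()
    a = doubly_centered_distances(x)
    b = doubly_centered_distances(y)
    # dCov^2 is the mean of the elementwise product; clamp at 0 before the sqrt.
    return np.sqrt(max((a * b).mean(), 0.0))

def rank_layers_by_dependence(feature_maps, outputs):
    """Score each layer's (flattened) feature maps by their dependence on the
    network outputs; layers with the lowest scores are pruning candidates."""
    scores = {}
    for name, fmap in feature_maps.items():
        flat = fmap.reshape(fmap.shape[0], -1)  # (batch, channels * h * w)
        scores[name] = distance_covariance(flat, outputs)
    return sorted(scores.items(), key=lambda kv: kv[1])  # least informative first
```

As a usage sketch, calling `rank_layers_by_dependence` on a dictionary mapping layer names to feature-map arrays of shape (batch, channels, h, w), together with an output array of shape (batch, classes), returns layers ordered from least to most dependent on the output, so the head of the list marks the layers a structured pruner would remove first.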


Related research

06/05/2021 · Feature Flow Regularization: Improving Structured Sparsity in Deep Neural Networks
Pruning is a model compression method that removes redundant parameters ...

04/14/2018 · Select, Attend, and Transfer: Light, Learnable Skip Connections
Skip connections in deep networks have improved both segmentation and cl...

06/05/2022 · Searching Similarity Measure for Binarized Neural Networks
Being a promising model to be deployed in resource-limited devices, Bina...

11/02/2022 · Backdoor Defense via Suppressing Model Shortcuts
Recent studies have demonstrated that deep neural networks (DNNs) are vu...

08/07/2022 · N2NSkip: Learning Highly Sparse Networks using Neuron-to-Neuron Skip Connections
The over-parametrized nature of Deep Neural Networks leads to considerab...

06/24/2020 · Feature-dependent Cross-Connections in Multi-Path Neural Networks
Learning a particular task from a dataset, samples in which originate fr...

02/21/2018 · Building Efficient ConvNets using Redundant Feature Pruning
This paper presents an efficient technique to prune deep and/or wide con...
