N2NSkip: Learning Highly Sparse Networks using Neuron-to-Neuron Skip Connections

08/07/2022
by Arvind Subramaniam, et al.

The over-parameterized nature of Deep Neural Networks (DNNs) considerably hinders deployment on low-end devices with time and space constraints. Network pruning strategies that sparsify DNNs through iterative prune-train schemes are often computationally expensive; as a result, techniques that prune at initialization, prior to training, have become increasingly popular. In this work, we propose neuron-to-neuron skip (N2NSkip) connections, which act as sparse weighted skip connections, to enhance the overall connectivity of pruned DNNs. Following a preliminary pruning step, N2NSkip connections are randomly added between individual neurons/channels of the pruned network while maintaining its overall sparsity. We demonstrate that introducing N2NSkip connections yields significantly superior performance, especially at high sparsity levels, compared to pruned networks without them. Additionally, we present a heat diffusion-based connectivity analysis to quantitatively determine the connectivity of the pruned network with respect to the reference network. We evaluate the efficacy of our approach on two different preliminary pruning methods that prune at initialization, and consistently obtain superior performance by exploiting the enhanced connectivity that N2NSkip connections provide.
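
The abstract describes two mechanisms: rewiring a pruned network with random neuron-to-neuron skip connections while holding the total number of nonzero weights fixed, and scoring connectivity via heat diffusion on the network's connection graph. The NumPy/SciPy sketch below is a hypothetical illustration of that bookkeeping, not the authors' implementation: the function names, the purely random preliminary pruning, the 20% skip fraction, and the toy layer sizes are all assumptions made for the example, and the paper's actual pruning criteria, training procedure, and connectivity metric may differ.

```python
# Hypothetical sketch (not the authors' code): NumPy/SciPy illustration of
# (1) rewiring a randomly pruned MLP with neuron-to-neuron skip connections
# while keeping the total number of nonzero weights fixed, and
# (2) a heat-diffusion score of how well-connected the resulting graph is.
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(0)


def prune_at_init(shapes, sparsity):
    """Preliminary pruning: keep a random (1 - sparsity) fraction of weights per layer."""
    return [(rng.random((n_out, n_in)) > sparsity).astype(float) for n_out, n_in in shapes]


def add_n2n_skips(masks, shapes, skip_fraction=0.2):
    """Drop `skip_fraction` of the surviving layer-to-layer connections and add the
    same number of skip connections between neurons of non-adjacent layers,
    so the overall sparsity of the network is unchanged."""
    budget = 0
    for m in masks:
        keep = np.argwhere(m > 0)
        n_drop = int(skip_fraction * len(keep))
        if n_drop:
            for r, c in keep[rng.choice(len(keep), size=n_drop, replace=False)]:
                m[r, c] = 0.0
        budget += n_drop
    layer_sizes = [shapes[0][1]] + [n_out for n_out, _ in shapes]
    skips = []  # (source_layer, target_layer, source_neuron, target_neuron)
    for _ in range(budget):
        src = rng.integers(0, len(layer_sizes) - 2)
        dst = rng.integers(src + 2, len(layer_sizes))  # strictly non-adjacent layer
        skips.append((src, dst, rng.integers(0, layer_sizes[src]),
                      rng.integers(0, layer_sizes[dst])))
    return masks, skips


def build_adjacency(masks, skips, shapes):
    """Undirected neuron-level connection graph of the sparse network."""
    layer_sizes = [shapes[0][1]] + [n_out for n_out, _ in shapes]
    offsets = np.concatenate(([0], np.cumsum(layer_sizes)))
    adj = np.zeros((offsets[-1], offsets[-1]))
    for l, m in enumerate(masks):          # m[r, c] connects input neuron c to output neuron r
        rows, cols = np.nonzero(m)
        adj[offsets[l] + cols, offsets[l + 1] + rows] = 1.0
    for src, dst, i, j in skips:
        adj[offsets[src] + i, offsets[dst] + j] = 1.0
    return np.maximum(adj, adj.T)


def heat_diffusion_score(adj, t=1.0):
    """Average heat that has diffused away from its source neuron after time t;
    higher values indicate a better-connected network."""
    lap = np.diag(adj.sum(axis=1)) - adj   # graph Laplacian
    heat = expm(-t * lap)                  # heat kernel e^{-tL}
    return 1.0 - np.trace(heat) / adj.shape[0]


shapes = [(64, 32), (64, 64), (10, 64)]    # toy 32-64-64-10 MLP (assumed sizes)
masks = prune_at_init(shapes, sparsity=0.9)
score_pruned = heat_diffusion_score(build_adjacency(masks, [], shapes))
masks, skips = add_n2n_skips(masks, shapes, skip_fraction=0.2)
score_n2nskip = heat_diffusion_score(build_adjacency(masks, skips, shapes))
print(f"connectivity without skips: {score_pruned:.3f}, with N2NSkip: {score_n2nskip:.3f}")
```

The abstract does not name the two preliminary pruning methods it evaluates, so the random mask above is only a stand-in, and the diffusion score is one plausible reading of the heat diffusion-based connectivity analysis it describes; in the paper the rewired sparse network would then be trained and compared against a pruned network without N2NSkip connections.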

Related research

10/06/2020 · RANP: Resource Aware Neuron Pruning at Initialization for 3D CNNs
Although 3D Convolutional Neural Networks (CNNs) are essential for most ...

02/21/2023 · Structured Bayesian Compression for Deep Neural Networks Based on The Turbo-VBI Approach
With the growth of neural network size, model compression has attracted ...

10/04/2018 · SNIP: Single-shot Network Pruning based on Connection Sensitivity
Pruning large neural networks while maintaining the performance is often...

01/26/2022 · On The Energy Statistics of Feature Maps in Pruning of Neural Networks with Skip-Connections
We propose a new structured pruning framework for compressing Deep Neura...

06/16/2020 · Progressive Skeletonization: Trimming more fat from a network at initialization
Recent studies have shown that skeletonization (pruning parameters) of n...

08/26/2017 · TraNNsformer: Neural Network Transformation for Memristive Crossbar based Neuromorphic System Design
Implementation of Neuromorphic Systems using post Complementary Metal-Ox...

11/14/2017 · Deep Rewiring: Training very sparse deep networks
Neuromorphic hardware tends to pose limits on the connectivity of deep n...
