Examining and Mitigating the Impact of Crossbar Non-idealities for Accurate Implementation of Sparse Deep Neural Networks

01/13/2022
by   Abhiroop Bhattacharjee, et al.

Recently, several structured pruning techniques have been introduced for energy-efficient implementation of Deep Neural Networks (DNNs) using fewer crossbars. Although these techniques claim to preserve the accuracy of the sparse DNNs on crossbars, none have studied the impact of the unavoidable crossbar non-idealities on the actual performance of the pruned networks. To this end, we perform a comprehensive study showing how highly sparse DNNs, which achieve significant crossbar compression rates, can suffer severe accuracy losses compared to unpruned DNNs mapped onto non-ideal crossbars. We perform experiments with multiple structured-pruning approaches (such as C/F pruning, XCS, and XRS) on VGG11 and VGG16 DNNs with benchmark datasets (CIFAR10 and CIFAR100). We propose two mitigation approaches - crossbar column rearrangement and Weight-Constrained Training (WCT) - that can be integrated with the crossbar mapping of sparse DNNs to minimize the accuracy losses incurred by the pruned models. Both approaches mitigate non-idealities by increasing the proportion of low-conductance synapses on crossbars, thereby improving their computational accuracy.
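To make the column-rearrangement idea concrete, here is a minimal sketch of how structurally pruned columns of a weight matrix might be regrouped before crossbar mapping. The function name, the tile size, and the sorting criterion (column L1 norm) are illustrative assumptions, not the paper's exact procedure; the point is that clustering near-zero columns raises the share of low-conductance synapses on some crossbar tiles.

```python
import numpy as np

def rearrange_columns(weights, xbar_size=64):
    """Sort weight-matrix columns by L1 norm so that sparse (low-conductance)
    columns cluster onto the same crossbar tiles of width `xbar_size`.

    Returns the rearranged matrix and the permutation, which is needed to
    restore the original output ordering after the crossbar computation.
    (Hypothetical helper for illustration only.)
    """
    perm = np.argsort(np.abs(weights).sum(axis=0))
    return weights[:, perm], perm

# Toy example: an 8x8 layer mapped onto 4-column crossbar tiles.
rng = np.random.default_rng(0)
W = rng.normal(size=(8, 8))
W[:, [1, 3, 6]] = 0.0              # columns removed by structured pruning
W_re, perm = rearrange_columns(W, xbar_size=4)

# The three all-zero columns now sit together at the front, so the first
# tile is dominated by low-conductance synapses.
assert np.all(W_re[:, :3] == 0.0)
# Inverting the permutation recovers the original layer exactly.
assert np.array_equal(W_re[:, np.argsort(perm)], W)
```

The inverse permutation matters in practice: rearranging columns permutes the layer's outputs, so the original order must be restored (or folded into the next layer's input mapping) for the network's function to be unchanged.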


