TraNNsformer: Neural Network Transformation for Memristive Crossbar based Neuromorphic System Design

08/26/2017
by   Aayush Ankit, et al.

Implementation of neuromorphic systems using post-CMOS (Complementary Metal-Oxide-Semiconductor) technology based Memristive Crossbar Arrays (MCA) has emerged as a promising solution for low-power acceleration of neural networks. However, the recent trend of designing Deep Neural Networks (DNNs) to achieve human-like cognitive abilities poses significant challenges to the scalable design of neuromorphic systems, owing to the growth in computation and storage demands. Network pruning [7] is a powerful technique for removing redundant connections and designing optimally connected (maximally sparse) DNNs. However, such pruning techniques induce irregular connections that are incoherent with the crossbar structure, and consequently yield DNNs whose hardware realizations are highly inefficient in terms of area and energy. In this work, we propose TraNNsformer, an integrated training framework that transforms DNNs to enable their efficient realization on MCA-based systems. TraNNsformer first prunes the connectivity matrix while forming clusters from the remaining connections. Subsequently, it retrains the network to fine-tune the connections and reinforce the clusters. This is done iteratively to transform the original connectivity into an optimally pruned and maximally clustered mapping. Without accuracy loss, TraNNsformer reduces the area (energy) consumption by 28% to 55% (49% to 67%) over the original network. Compared to network pruning, TraNNsformer achieves 28% to 49% (15% to 29%) area (energy) savings. Furthermore, TraNNsformer is a technology-aware framework that allows mapping a given DNN to any MCA size permissible by the memristive technology for reliable operation.
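The transformation sketched in the abstract (prune connections, cluster the survivors into crossbar-sized groups, retrain, and repeat) can be illustrated with a small example. The code below is not the authors' algorithm; it is a minimal, hypothetical sketch of crossbar-aligned pruning in which entire crossbar-sized tiles of a weight matrix are kept or zeroed based on their L1 norm. The 64x64 crossbar size, the block_prune helper, and the keep ratio are illustrative assumptions.

```python
# Minimal sketch (not the authors' code): crossbar-aware block pruning.
# Assumptions: a dense weight matrix, a hypothetical 64x64 crossbar size,
# and a simple L1 block-norm criterion for deciding which tiles survive.
import numpy as np

def block_prune(weights, xbar=64, keep_ratio=0.5):
    """Zero out whole crossbar-sized tiles with the smallest L1 norms,
    so the surviving connections stay clustered into full crossbars."""
    rows, cols = weights.shape
    # Pad so the matrix splits evenly into xbar x xbar tiles.
    pr, pc = (-rows) % xbar, (-cols) % xbar
    padded = np.pad(weights, ((0, pr), (0, pc)))
    nr, nc = padded.shape[0] // xbar, padded.shape[1] // xbar
    tiles = padded.reshape(nr, xbar, nc, xbar)
    norms = np.abs(tiles).sum(axis=(1, 3))          # L1 norm per tile
    k = max(1, int(keep_ratio * norms.size))        # number of tiles to keep
    threshold = np.sort(norms, axis=None)[-k]       # k-th largest tile norm
    mask = (norms >= threshold)[:, None, :, None]   # broadcast tile mask
    kept = np.where(mask, tiles, 0.0)
    return kept.reshape(padded.shape)[:rows, :cols]

# Example: prune a 784x300 layer so roughly half of the 64x64 tiles survive.
w = np.random.randn(784, 300)
w_sparse = block_prune(w, xbar=64, keep_ratio=0.5)
```

In a full pipeline along the lines of the paper, such a pruning step would alternate with retraining, so that the connections inside surviving tiles are reinforced before the next pruning pass.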

Related research

08/29/2019 · An Ultra-Efficient Memristor-Based DNN Framework with Structured Weight Pruning and Quantization Using ADMM
The high computation and memory storage of large deep neural networks (D...

01/13/2022 · Examining and Mitigating the Impact of Crossbar Non-idealities for Accurate Implementation of Sparse Deep Neural Networks
Recently, several structured pruning techniques have been introduced for ...

01/01/2019 · MaD: Mapping and debugging framework for implementing deep neural network onto a neuromorphic chip with crossbar array of synapses
Neuromorphic systems or dedicated hardware for neuromorphic computing is...

09/19/2020 · Enabling Resource-Aware Mapping of Spiking Neural Networks via Spatial Decomposition
With growing model complexity, mapping Spiking Neural Network (SNN)-base...

08/07/2022 · N2NSkip: Learning Highly Sparse Networks using Neuron-to-Neuron Skip Connections
The over-parametrized nature of Deep Neural Networks leads to considerab...

02/11/2017 · Group Scissor: Scaling Neuromorphic Computing Design to Large Neural Networks
Synapse crossbar is an elementary structure in Neuromorphic Computing Sy...

08/05/2015 · INsight: A Neuromorphic Computing System for Evaluation of Large Neural Networks
Deep neural networks have demonstrated impressive results in variou...
