Towards Scalable, Efficient and Accurate Deep Spiking Neural Networks with Backward Residual Connections, Stochastic Softmax and Hybridization

10/30/2019
by Priyadarshini Panda, et al.

Spiking Neural Networks (SNNs) may offer an energy-efficient alternative for implementing deep learning applications. In recent years, several training methods have been proposed to improve the accuracy of SNNs on large-scale tasks, both supervised (ANN-to-SNN conversion, spike-based gradient descent) and unsupervised (spike-timing-dependent plasticity). However, each of these methods suffers from scalability, latency, and accuracy limitations. In this paper, we propose novel algorithmic techniques that modify the SNN configuration with backward residual connections, stochastic softmax, and hybrid artificial-and-spiking neuronal activations to improve the learning ability of these training methodologies, yielding competitive accuracy along with large efficiency gains over their artificial counterparts. Note that artificial counterparts here refer to conventional deep learning/artificial neural networks (ANNs). Our techniques apply to VGG/Residual architectures and are compatible with all forms of training methodologies. Our analysis reveals that the proposed solutions achieve near state-of-the-art accuracy with significant energy efficiency and reduced parameter overhead, translating to hardware improvements on complex visual recognition tasks such as the CIFAR10 and ImageNet datasets.
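The abstract does not specify how the stochastic softmax is realized. The sketch below is a minimal, hypothetical PyTorch interpretation in which the output logits are perturbed with zero-mean noise during training before normalization; the class name StochasticSoftmax and the noise_std parameter are illustrative assumptions, not the paper's actual implementation.

# Hypothetical sketch of a "stochastic softmax" output layer: interpreted here
# as injecting Gaussian noise into the logits during training before the
# softmax. The paper's exact formulation may differ; names are illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F


class StochasticSoftmax(nn.Module):
    def __init__(self, noise_std: float = 0.1):
        super().__init__()
        self.noise_std = noise_std  # scale of the logit perturbation (assumed)

    def forward(self, logits: torch.Tensor) -> torch.Tensor:
        if self.training:
            # Perturb the logits with zero-mean Gaussian noise so that the
            # backward pass is not dominated by a single over-confident class.
            logits = logits + self.noise_std * torch.randn_like(logits)
        return F.softmax(logits, dim=-1)


# Usage: drop-in replacement for the final softmax of an SNN/ANN classifier head.
head = StochasticSoftmax(noise_std=0.1)
probs = head(torch.randn(8, 10))  # batch of 8 samples, 10 classes
print(probs.sum(dim=-1))          # each row sums to 1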

Related research:

04/24/2023 - Spikingformer: Spike-driven Residual Learning for Transformer-based Spiking Neural Network
Spiking neural networks (SNNs) offer a promising energy-efficient altern...

03/23/2023 - Skip Connections in Spiking Neural Networks: An Analysis of Their Effect on Network Training
Spiking neural networks (SNNs) have gained attention as a promising alte...

03/31/2018 - Combining STDP and Reward-Modulated STDP in Deep Convolutional Spiking Neural Networks for Digit Recognition
The primate visual system has inspired the development of deep artificia...

05/04/2020 - Enabling Deep Spiking Neural Networks with Hybrid Conversion and Spike Timing Dependent Backpropagation
Spiking Neural Networks (SNNs) operate with asynchronous discrete events...

03/01/2021 - A Little Energy Goes a Long Way: Energy-Efficient, Accurate Conversion from Convolutional Neural Networks to Spiking Neural Networks
Spiking neural networks (SNNs) offer an inherent ability to process spat...

02/11/2019 - ReStoCNet: Residual Stochastic Binary Convolutional Spiking Neural Network for Memory-Efficient Neuromorphic Computing
In this work, we propose ReStoCNet, a residual stochastic multilayer con...

11/24/2021 - Softmax Gradient Tampering: Decoupling the Backward Pass for Improved Fitting
We introduce Softmax Gradient Tampering, a technique for modifying the g...
