A Temporally and Spatially Local Spike-based Backpropagation Algorithm to Enable Training in Hardware

07/20/2022
by Anmol Biswas et al.

Spiking Neural Networks (SNNs) have emerged as a hardware-efficient architecture for classification tasks. The penalty of spike-based encoding has been the lack of a universal training mechanism performed entirely using spikes. There have been several attempts to adopt the powerful backpropagation (BP) technique used in non-spiking artificial neural networks (ANNs): (1) SNNs can be trained by externally computed numerical gradients. (2) A major advancement toward native spike-based learning has been the use of approximate backpropagation using spike-time-dependent plasticity (STDP) with phased forward/backward passes. However, the transfer of information between such phases requires external memory and computational access, which is a challenge for neuromorphic hardware implementations. In this paper, we propose a stochastic SNN-based Back-Prop (SSNN-BP) algorithm that utilizes a composite neuron to simultaneously compute the forward-pass activations and backward-pass gradients explicitly with spikes. Although signed gradient values are a challenge for spike-based representation, we tackle this by splitting the gradient signal into positive and negative streams. The composite neuron encodes information in the form of stochastic spike-trains and converts backpropagation weight updates into temporally and spatially local, discrete STDP-like spike coincidence updates compatible with hardware-friendly Resistive Processing Units (RPUs). Furthermore, our method approaches the BP-trained ANN baseline with sufficiently long spike-trains. Finally, we show that the softmax cross-entropy loss function can be implemented through inhibitory lateral connections enforcing a Winner Take All (WTA) rule. Our SNN shows excellent generalization through comparable performance to ANNs on the MNIST, Fashion-MNIST and Extended MNIST datasets. Thus, SSNN-BP enables BP compatible with purely spike-based neuromorphic hardware.
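To make the two central ideas concrete, the sketch below illustrates (a) splitting a signed gradient into non-negative positive and negative streams and (b) turning the backpropagation weight update into a spike-coincidence count between stochastic spike-trains. This is a minimal NumPy illustration, assuming Bernoulli (rate-coded) spike encoding with activations and gradient magnitudes scaled into [0, 1]; the function names and toy values are illustrative and not taken from the paper.

import numpy as np

rng = np.random.default_rng(0)

def encode(values, T):
    # Encode values in [0, 1] as stochastic (Bernoulli) spike trains of length T.
    return (rng.random((T, values.size)) < values).astype(np.float32)

def split_signed(grad):
    # Split a signed gradient into non-negative positive and negative streams.
    return np.maximum(grad, 0.0), np.maximum(-grad, 0.0)

# Toy layer: pre-synaptic activations and a signed error signal on the post-synaptic side.
a_pre  = np.array([0.2, 0.7, 0.5])   # forward activations, already in [0, 1]
g_post = np.array([0.4, -0.3])       # signed gradient w.r.t. post-synaptic input
T  = 2000                            # spike-train length
lr = 0.1

g_plus, g_minus = split_signed(g_post)

# Stochastic spike trains for the forward activations and the two gradient streams.
S_pre   = encode(a_pre,  T)          # shape (T, 3)
S_plus  = encode(g_plus, T)          # shape (T, 2)
S_minus = encode(g_minus, T)         # shape (T, 2)

# STDP-like coincidence update: count pre/post spike coincidences per synapse.
# In expectation, (S_plus - S_minus)^T S_pre / T equals outer(g_post, a_pre),
# so the update approximates the BP rule dW = -lr * g_post a_pre^T.
dW = -lr * (S_plus.T @ S_pre - S_minus.T @ S_pre) / T

print("stochastic update:\n", dW)
print("exact -lr * outer(g, a):\n", -lr * np.outer(g_post, a_pre))

As the spike-train length T grows, the coincidence counts converge to the exact outer-product update, which is the sense in which an approach of this kind can approach a BP-trained ANN baseline with sufficiently long spike-trains.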

Related research

05/21/2018  Hybrid Macro/Micro Level Backpropagation for Training Deep Spiking Neural Networks
Spiking neural networks (SNNs) are positioned to enable spatio-temporal ...

02/24/2020  Temporal Spike Sequence Learning via Backpropagation for Deep Spiking Neural Networks
Spiking neural networks (SNNs) are well suited for spatio-temporal learn...

01/01/2020  Exploring Adversarial Attack in Spiking Neural Networks with Spike-Compatible Gradient
Recently, backpropagation through time inspired learning algorithms are ...

08/31/2021  Spike time displacement based error backpropagation in convolutional spiking neural networks
We recently proposed the STiDi-BP algorithm, which avoids backward recur...

06/15/2017  Hardware-efficient on-line learning through pipelined truncated-error backpropagation in binary-state networks
Artificial neural networks (ANNs) trained using backpropagation are powe...

03/30/2020  Critical Limits in a Bump Attractor Network of Spiking Neurons
A bump attractor network is a model that implements a competitive neuron...
