Hybrid Macro/Micro Level Backpropagation for Training Deep Spiking Neural Networks

05/21/2018
by Yingyezhe Jin, et al.

Spiking neural networks (SNNs) are positioned to enable spatio-temporal information processing and ultra-low-power, event-driven neuromorphic hardware. However, SNNs have yet to match the performance of conventional deep artificial neural networks (ANNs), a long-standing challenge due to the complex dynamics and non-differentiable spike events encountered in training. Existing SNN error backpropagation (BP) methods are limited by poor scalability, improper handling of spiking discontinuities, and/or a mismatch between the rate-coded loss function and the computed gradient. We present a hybrid macro/micro level backpropagation algorithm (HM2-BP) for training multi-layer SNNs. Temporal effects are precisely captured by the proposed spike-train level post-synaptic potential (S-PSP) at the microscopic level. Rate-coded errors are defined at the macroscopic level, then computed and back-propagated across both the macroscopic and microscopic levels. Unlike existing BP methods, HM2-BP directly computes the gradient of the rate-coded loss function with respect to the tunable parameters. We evaluate HM2-BP by training deep fully connected and convolutional SNNs on the static MNIST [13] and dynamic neuromorphic N-MNIST [22] datasets. HM2-BP achieves an accuracy of 99.49% on MNIST, outperforming the best reported results from existing SNN BP algorithms. It also achieves competitive performance, surpassing conventional deep learning models when dealing with asynchronous spiking streams.
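To make the two levels concrete, the sketch below illustrates the general idea in Python: a micro-level S-PSP e_{i|j} accumulated from pre- and post-synaptic spike times, and a macro-level firing-count estimate and rate-coded error differentiated with respect to the weights. This is a minimal illustration, not the authors' released implementation: the dual-exponential PSP kernel, time constants, threshold, variable names, and toy spike trains are assumptions made for this example, and the gradient shown omits the micro-level derivative of the S-PSP with respect to post-synaptic firing times that the full HM2-BP algorithm also propagates.

```python
# Illustrative sketch of the two levels in HM2-BP (all constants and names are
# assumptions for this example, not the paper's code).
import numpy as np

TAU_M, TAU_S = 10.0, 2.5   # assumed membrane / synaptic time constants (ms)
V_TH = 1.0                 # assumed firing threshold nu

def psp_kernel(dt):
    """Normalized dual-exponential PSP kernel evaluated at time lag dt > 0."""
    return (np.exp(-dt / TAU_M) - np.exp(-dt / TAU_S)) / (1.0 - TAU_S / TAU_M)

def spsp(pre_spikes, post_spikes):
    """Micro level: S-PSP e_{i|j}, the summed PSP contribution of pre-synaptic
    spike train j evaluated at every post-synaptic firing time of neuron i."""
    total = 0.0
    for t_f in post_spikes:          # post-synaptic firing times t_i^(f)
        for t_k in pre_spikes:       # earlier pre-synaptic spike times t_j^(k)
            if t_k < t_f:
                total += psp_kernel(t_f - t_k)
    return total

def rate_and_grad(weights, e_spsp, target_count):
    """Macro level: approximate the firing count a_i = (1/nu) * sum_j w_ij * e_{i|j},
    then differentiate the rate-coded squared error directly w.r.t. the weights.
    (The S-PSP is treated as constant here; the full algorithm also backpropagates
    through its dependence on the post-synaptic firing times.)"""
    a_i = np.dot(weights, e_spsp) / V_TH    # macroscopic firing-count estimate
    delta = a_i - target_count              # rate-coded error at the macro level
    grad_w = delta * e_spsp / V_TH          # dE/dw_ij ~ delta * e_{i|j} / nu
    return a_i, grad_w

# Toy usage: one post-synaptic neuron, three pre-synaptic spike trains (times in ms).
pre_trains = [np.array([2.0, 12.0, 30.0]), np.array([5.0, 25.0]), np.array([8.0])]
post_train = np.array([10.0, 28.0])
e = np.array([spsp(p, post_train) for p in pre_trains])
w = np.array([0.4, 0.3, 0.2])
rate, grad = rate_and_grad(w, e, target_count=3.0)
```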


Related research

Spike-Train Level Backpropagation for Training Deep Recurrent Spiking Neural Networks (08/18/2019)
Spiking neural networks (SNNs) are more biologically plausible than conv...

Temporal Spike Sequence Learning via Backpropagation for Deep Spiking Neural Networks (02/24/2020)
Spiking neural networks (SNNs) are well suited for spatio-temporal learn...

A Temporally and Spatially Local Spike-based Backpropagation Algorithm to Enable Training in Hardware (07/20/2022)
Spiking Neural Networks (SNNs) have emerged as a hardware efficient arch...

Spike time displacement based error backpropagation in convolutional spiking neural networks (08/31/2021)
We recently proposed the STiDi-BP algorithm, which avoids backward recur...

Comparing SNNs and RNNs on Neuromorphic Vision Datasets: Similarities and Differences (05/02/2020)
Neuromorphic data, recording frameless spike events, have attracted cons...

Deep Spiking Neural Network with Spike Count based Learning Rule (02/15/2019)
Deep spiking neural networks (SNNs) support asynchronous event-driven co...

Spatio-Temporal Backpropagation for Training High-performance Spiking Neural Networks (06/08/2017)
Compared with artificial neural networks (ANNs), spiking neural networks...
