Hybrid Macro/Micro Level Backpropagation for Training Deep Spiking Neural Networks
Spiking neural networks (SNNs) are positioned to enable spatio-temporal information processing and ultra-low-power event-driven neuromorphic hardware. However, SNNs have yet to reach the same performance level as conventional deep artificial neural networks (ANNs), a long-standing challenge due to the complex dynamics and non-differentiable spike events encountered in training. Existing SNN error backpropagation (BP) methods are limited by poor scalability, improper handling of spiking discontinuities, and/or a mismatch between the rate-coded loss function and the computed gradient. We present a hybrid macro/micro level backpropagation algorithm (HM2-BP) for training multi-layer SNNs. Temporal effects are precisely captured by the proposed spike-train level post-synaptic potential (S-PSP) at the microscopic level, while rate-coded errors are defined at the macroscopic level and are computed and back-propagated across both levels. Unlike existing BP methods, HM2-BP directly computes the gradient of the rate-coded loss function with respect to the tunable parameters. We evaluate the proposed HM2-BP algorithm by training deep fully connected and convolutional SNNs on the static MNIST [13] and dynamic neuromorphic N-MNIST [22] datasets. HM2-BP achieves accuracy levels of 99.49% on MNIST and 98.88% on N-MNIST, respectively, outperforming the best reported results from existing SNN BP algorithms. It also achieves competitive performance, surpassing that of conventional deep learning models, when dealing with asynchronous spiking streams.
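To make the macro/micro decomposition concrete, the sketch below illustrates, for a single layer, the chain rule the abstract describes: a rate-coded MSE loss at the macro level, with S-PSP values carrying the micro-level spike-train effects. This is a minimal illustration under simplifying assumptions, not the paper's full algorithm: the S-PSP matrix is taken as given rather than computed from spike trains, the firing count is approximated as total post-synaptic potential divided by the firing threshold, the function and variable names are hypothetical, and the micro-level derivatives of the S-PSP itself (which HM2-BP also back-propagates) are omitted.

```python
import numpy as np

def macro_micro_grad(w, spsp, target_rate, nu=1.0):
    """Hypothetical one-layer sketch of a macro/micro chain rule.

    w           : (n_post, n_pre) synaptic weights
    spsp        : (n_post, n_pre) S-PSP values e_{i|j} (micro level),
                  assumed precomputed from the pre/post spike trains
    target_rate : (n_post,) desired output firing counts
    nu          : firing threshold; macro level approximates o_i = a_i / nu
    """
    # Macro level: total post-synaptic potential and rate-coded output.
    a = (w * spsp).sum(axis=1)   # a_i = sum_j w_ij * e_{i|j}
    o = a / nu                   # approximate firing count
    # Rate-coded loss E = 1/2 * ||o - y||^2; gradient w.r.t. the output rates.
    dE_do = o - target_rate
    # Chain rule across the macro level (dE/do * do/da = dE_do / nu)
    # and the micro level (da_i/dw_ij = e_{i|j}).
    dE_dw = (dE_do / nu)[:, None] * spsp
    return dE_dw

# Toy usage with random weights and S-PSP values.
rng = np.random.default_rng(0)
grad = macro_micro_grad(rng.normal(size=(3, 5)),
                        rng.uniform(0.0, 2.0, size=(3, 5)),
                        np.array([5.0, 0.0, 2.0]))
print(grad.shape)  # (3, 5)
```

The key point the sketch conveys is that the gradient of the rate-coded loss factors through the S-PSP terms, so spiking discontinuities never have to be differentiated directly; in the full method, the S-PSP additionally contributes its own micro-level derivative when errors are propagated to earlier layers.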