Training High-Performance Low-Latency Spiking Neural Networks by Differentiation on Spike Representation

05/01/2022
by   Qingyan Meng, et al.

Spiking Neural Networks (SNNs) are a promising energy-efficient AI model when implemented on neuromorphic hardware. However, training SNNs efficiently is challenging because of their non-differentiability. Most existing methods either suffer from high latency (i.e., long simulation time steps) or cannot achieve performance as high as Artificial Neural Networks (ANNs). In this paper, we propose the Differentiation on Spike Representation (DSR) method, which achieves high performance competitive with ANNs while retaining low latency. First, we encode spike trains into a spike representation using (weighted) firing rate coding. Based on this spike representation, we systematically derive that the spiking dynamics of common neural models can be represented as a sub-differentiable mapping. With this viewpoint, the proposed DSR method trains SNNs through the gradients of this mapping and avoids the common non-differentiability problem in SNN training. We then analyze the error incurred when the forward computation of the SNN represents this mapping. To reduce this error, we propose to train the spike threshold in each layer and to introduce a new hyperparameter for the neural models. With these components, the DSR method achieves state-of-the-art SNN performance with low latency on both static and neuromorphic datasets, including CIFAR-10, CIFAR-100, ImageNet, and DVS-CIFAR10.
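The core idea can be sketched in a few lines of PyTorch-style code (this is an illustrative reconstruction, not the paper's released implementation): the forward pass simulates an integrate-and-fire neuron with soft reset for T time steps and returns the scaled firing rate, which for a constant input current x approximately equals clamp(x, 0, v_th); the backward pass then differentiates through that sub-differentiable mapping instead of through the binary spikes. The names DSRNeuron, v_th, and T are assumptions for the sketch, and the trainable thresholds and the extra hyperparameter described in the abstract are omitted.

```python
import torch


class DSRNeuron(torch.autograd.Function):
    """Minimal sketch of differentiation on spike representation (DSR).

    Forward: simulate an integrate-and-fire neuron with soft reset for T
    time steps under a constant input current and return the scaled firing
    rate v_th * (#spikes) / T, which approximates clamp(x, 0, v_th).
    Backward: take gradients through that sub-differentiable clamp mapping
    rather than through the non-differentiable spikes.
    """

    @staticmethod
    def forward(ctx, x, v_th, T):
        ctx.save_for_backward(x)
        ctx.v_th = v_th
        v = torch.zeros_like(x)            # membrane potential
        spike_count = torch.zeros_like(x)
        for _ in range(T):
            v = v + x                      # integrate the input current
            spike = (v >= v_th).float()    # emit a spike where threshold is reached
            v = v - spike * v_th           # soft reset: subtract the threshold
            spike_count = spike_count + spike
        return v_th * spike_count / T      # spike representation (scaled firing rate)

    @staticmethod
    def backward(ctx, grad_output):
        (x,) = ctx.saved_tensors
        # Gradient of the surrogate mapping clamp(x, 0, v_th): 1 inside (0, v_th), 0 outside.
        grad_x = grad_output * ((x > 0) & (x < ctx.v_th)).float()
        return grad_x, None, None          # no gradients for v_th and T in this sketch


# Example: differentiate a loss through the spike representation.
x = torch.randn(4, 8, requires_grad=True)
out = DSRNeuron.apply(x, 1.0, 20)          # v_th = 1.0, T = 20 time steps
out.sum().backward()
print(x.grad.shape)                        # torch.Size([4, 8])
```

Because gradients are taken with respect to the rate-coded representation rather than individual spike times, the simulation length T only affects how closely the firing rate approximates the clamp mapping, which is why the method can keep latency low.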

Related research

10/09/2022 · Online Training Through Time for Spiking Neural Networks
Spiking neural networks (SNNs) are promising brain-inspired energy-effic...

02/01/2023 · SPIDE: A Purely Spike-based Method for Training Feedback Spiking Neural Networks
Spiking neural networks (SNNs) with event-based computation are promisin...

05/14/2021 · Efficient Spiking Neural Networks with Radix Encoding
Spiking neural networks (SNNs) have advantages in latency and energy eff...

10/05/2020 · Revisiting Batch Normalization for Training Low-latency Deep Spiking Neural Networks from Scratch
Spiking Neural Networks (SNNs) have recently emerged as an alternative t...

08/01/2023 · Evaluating Spiking Neural Network On Neuromorphic Platform For Human Activity Recognition
Energy efficiency and low latency are crucial requirements for designing...

01/27/2023 · Training Full Spike Neural Networks via Auxiliary Accumulation Pathway
Due to the binary spike signals making converting the traditional high-p...

05/09/2023 · Spiking Neural Networks in the Alexiewicz Topology: A New Perspective on Analysis and Error Bounds
In order to ease the analysis of error propagation in neuromorphic compu...
