SPIDE: A Purely Spike-based Method for Training Feedback Spiking Neural Networks

02/01/2023
by Mingqing Xiao, et al.

Spiking neural networks (SNNs) with event-based computation are promising brain-inspired models for energy-efficient applications on neuromorphic hardware. However, most supervised SNN training methods, such as conversion from artificial neural networks or direct training with surrogate gradients, require complex computation rather than the spike-based operations of spiking neurons during training. In this paper, we study spike-based implicit differentiation on the equilibrium state (SPIDE), which extends the recently proposed training method, implicit differentiation on the equilibrium state (IDE), to supervised learning with purely spike-based computation, demonstrating the potential for energy-efficient training of SNNs. Specifically, we introduce ternary spiking neuron couples and prove that implicit differentiation can be solved by spikes based on this design, so that the whole training procedure, including both the forward and backward passes, becomes event-driven spike computation, and weights are updated locally from two-stage average firing rates. We then propose modifying the reset membrane potential to reduce the approximation error of spikes. With these key components, we can train SNNs with flexible structures in a small number of time steps and with firing sparsity during training, and a theoretical estimate of energy costs demonstrates the potential for high efficiency. Meanwhile, experiments show that even with these constraints, our trained models still achieve competitive results on MNIST, CIFAR-10, CIFAR-100, and CIFAR10-DVS. Our code is available at https://github.com/pkuxmq/SPIDE-FSNN.
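The equilibrium-state idea underlying IDE and SPIDE can be illustrated with a minimal NumPy sketch (this is not the authors' implementation; the feedback weights `W`, input `b`, threshold `v_th`, and step count `T` are illustrative). A feedback spiking layer with soft reset is simulated for `T` steps, and its average firing rate is compared against the fixed point of the corresponding rate equation, the equilibrium state that the training method differentiates implicitly:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
# Small feedback weights keep the rate map contractive, so the
# equilibrium (fixed point) of the rate equation is unique.
W = 0.1 * rng.standard_normal((n, n))
b = rng.uniform(0.2, 0.8, n)   # constant input current per neuron
v_th = 1.0                     # firing threshold
T = 2000                       # simulation time steps

u = np.zeros(n)                # membrane potentials
spike_count = np.zeros(n)
for _ in range(T):
    s = (u >= v_th).astype(float)      # emit spikes
    spike_count += s
    u = u - v_th * s + W @ s + b       # soft reset by subtraction

a_T = spike_count / T          # average firing rate over T steps

# Equilibrium of the underlying rate equation:
#   a* = clip((W a* + b) / v_th, 0, 1)
a_star = np.zeros(n)
for _ in range(500):
    a_star = np.clip((W @ a_star + b) / v_th, 0.0, 1.0)

err = np.max(np.abs(a_T - a_star))
print(err)  # discretization error shrinks as T grows
```

Because the soft-reset dynamics satisfy v_th * a_T = W a_T + b - u_T / T, the rate `a_T` approaches the equilibrium `a_star` at rate O(1/T); SPIDE's contribution is computing the implicit gradient at this equilibrium with spikes as well, rather than with dense arithmetic.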

