Going Deeper With Directly-Trained Larger Spiking Neural Networks

10/29/2020
by Hanle Zheng, et al.

Spiking neural networks (SNNs) are promising for bio-plausible coding of spatio-temporal information and for event-driven signal processing, which makes them well suited to energy-efficient implementation on neuromorphic hardware. However, the unique working mode of SNNs makes them more difficult to train than traditional networks. Currently, there are two main routes to training deep SNNs with high performance. The first is to convert a pre-trained ANN model into its SNN version, which usually requires a long coding window for convergence and cannot exploit spatio-temporal features during training to solve temporal tasks. The other is to train SNNs directly in the spatio-temporal domain. But due to the binary spike activity of the firing function and the problem of vanishing or exploding gradients, current methods are restricted to shallow architectures and are thereby difficult to scale to large datasets (e.g., ImageNet). To this end, we propose a threshold-dependent batch normalization (tdBN) method based on the emerging spatio-temporal backpropagation, termed "STBP-tdBN", enabling direct training of very deep SNNs and efficient implementation of their inference on neuromorphic hardware. With the proposed method and an elaborated shortcut connection, we significantly extend directly-trained SNNs from shallow structures (<10 layers) to a very deep structure (50 layers). Furthermore, we theoretically analyze the effectiveness of our method based on "Block Dynamical Isometry" theory. Finally, we report superior accuracy results, including 93.15% on CIFAR-10 with very few timesteps. To our best knowledge, this is the first time directly-trained deep SNNs have achieved high performance on ImageNet.
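The core idea of tdBN, as the abstract describes it, is to normalize pre-activations jointly over the batch and the temporal dimension, then rescale them relative to the firing threshold so that inputs to the spiking neurons stay in a trainable range. A minimal NumPy sketch is given below; the function name, the `(T, N, C, H, W)` tensor layout, and the `alpha` scale factor are illustrative assumptions, not the paper's exact implementation:

```python
import numpy as np

def td_batch_norm(x, gamma, beta, v_th=1.0, alpha=1.0, eps=1e-5):
    """Hypothetical sketch of threshold-dependent batch normalization (tdBN).

    x     : pre-activations of shape (T, N, C, H, W), i.e. feature maps
            stacked over T simulation timesteps.
    gamma : per-channel scale, shape (C,).
    beta  : per-channel shift, shape (C,).
    v_th  : firing threshold of the spiking neurons.
    alpha : extra scale factor applied together with v_th.
    """
    # Statistics are taken per channel but over time, batch, and
    # spatial dimensions jointly -- this is the "temporal" part of tdBN.
    mean = x.mean(axis=(0, 1, 3, 4), keepdims=True)
    var = x.var(axis=(0, 1, 3, 4), keepdims=True)
    x_hat = (x - mean) / np.sqrt(var + eps)

    # Rescale toward alpha * v_th so the normalized pre-activations are
    # matched to the neuron threshold rather than to unit scale.
    g = gamma.reshape(1, 1, -1, 1, 1)
    b = beta.reshape(1, 1, -1, 1, 1)
    return alpha * v_th * g * x_hat + b
```

With `gamma = 1`, `beta = 0`, each channel of the output has zero mean and standard deviation close to `alpha * v_th`, regardless of the input scale.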

Related research

- Multi-Level Firing with Spiking DS-ResNet: Enabling Better and Deeper Directly-Trained Spiking Neural Networks (10/12/2022)
- Direct Training for Spiking Neural Networks: Faster, Larger, Better (09/16/2018)
- Gated Attention Coding for Training High-performance and Efficient Spiking Neural Networks (08/12/2023)
- Spatio-Temporal Backpropagation for Training High-performance Spiking Neural Networks (06/08/2017)
- Spikeformer: A Novel Architecture for Training High-Performance Low-Latency Spiking Neural Network (11/19/2022)
- Membrane Potential Batch Normalization for Spiking Neural Networks (08/16/2023)
- PLSM: A Parallelized Liquid State Machine for Unintentional Action Detection (05/06/2021)
