Enabling Deep Spiking Neural Networks with Hybrid Conversion and Spike Timing Dependent Backpropagation

05/04/2020
by Nitin Rathi, et al.

Spiking Neural Networks (SNNs) operate with asynchronous discrete events (or spikes), which can potentially lead to higher energy efficiency in neuromorphic hardware implementations. Many works have shown that an SNN for inference can be formed by copying the weights from a trained Artificial Neural Network (ANN) and setting the firing threshold of each layer to the maximum input received in that layer. Such converted SNNs require a large number of time steps to achieve competitive accuracy, which diminishes the energy savings. The number of time steps can be reduced by training SNNs with spike-based backpropagation from scratch, but that is computationally expensive and slow. To address these challenges, we present a computationally efficient training technique for deep SNNs. We propose a hybrid training methodology: 1) take a converted SNN and use its weights and thresholds as an initialization for spike-based backpropagation, and 2) perform incremental spike-timing-dependent backpropagation (STDB) on this carefully initialized network to obtain an SNN that converges within a few epochs and requires fewer time steps for input processing. STDB is performed with a novel surrogate gradient function defined using the neuron's spike time. The proposed training methodology converges in less than 20 epochs of spike-based backpropagation for most standard image classification datasets, thereby greatly reducing the training complexity compared to training SNNs from scratch. We perform experiments on the CIFAR-10, CIFAR-100, and ImageNet datasets for both VGG and ResNet architectures. We achieve a top-1 accuracy of 65.19% for the ImageNet dataset with an SNN that requires 10X fewer time steps than converted SNNs of similar accuracy.
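
The abstract describes the STDB surrogate gradient only qualitatively. The sketch below is a minimal, illustrative PyTorch-style rendering of the idea: the spike non-linearity is a hard threshold in the forward pass, while the backward pass substitutes a gradient of the form alpha * exp(-beta * dt), where dt is the number of time steps since the neuron last fired. The class name, the values of alpha and beta, and the tensor of last-spike intervals are assumptions for illustration, not the authors' released code or exact hyperparameters.

# Minimal sketch (not the authors' implementation) of a spike-timing-dependent
# surrogate gradient. Forward: fire a spike when the membrane potential crosses
# the layer threshold. Backward: approximate the non-differentiable spike with
# alpha * exp(-beta * dt), where dt counts time steps since the last spike.
import torch

class STDBSpike(torch.autograd.Function):
    alpha = 0.3   # assumed scale of the surrogate gradient
    beta = 0.01   # assumed decay rate with respect to spike timing

    @staticmethod
    def forward(ctx, membrane_potential, threshold, last_spike_dt):
        # Emit a spike wherever the membrane potential exceeds the threshold.
        spikes = (membrane_potential >= threshold).float()
        ctx.save_for_backward(last_spike_dt)
        return spikes

    @staticmethod
    def backward(ctx, grad_output):
        (last_spike_dt,) = ctx.saved_tensors
        # Surrogate derivative: larger for neurons that spiked recently,
        # decaying exponentially with the time since their last spike.
        surrogate = STDBSpike.alpha * torch.exp(-STDBSpike.beta * last_spike_dt)
        return grad_output * surrogate, None, None

# Usage: u is the membrane potential, dt counts steps since each neuron's last spike.
u = torch.randn(4, requires_grad=True)
dt = torch.tensor([1.0, 5.0, 20.0, 100.0])
spikes = STDBSpike.apply(u, 1.0, dt)
spikes.sum().backward()
print(u.grad)  # gradients shrink as the time since the last spike grows

The intuition captured here is that neurons which spiked recently receive larger weight updates, while neurons that have been silent for many time steps contribute little, mirroring the spike-timing dependence of the STDB rule described in the abstract.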

