Optimal ANN-SNN Conversion for Fast and Accurate Inference in Deep Spiking Neural Networks

05/25/2021
by Jianhao Ding, et al.
Spiking Neural Networks (SNNs), as bio-inspired, energy-efficient neural networks, have attracted great attention from both researchers and industry. The most efficient way to train deep SNNs is through ANN-SNN conversion. However, the conversion usually suffers from accuracy loss and long inference time, which impede the practical application of SNNs. In this paper, we theoretically analyze ANN-SNN conversion and derive sufficient conditions for optimal conversion. To better correlate the ANN and the SNN and achieve higher accuracy, we propose the Rate Norm Layer to replace the ReLU activation function during source ANN training, enabling direct conversion from a trained ANN to an SNN. Moreover, we propose an optimal fit curve to quantify how well the activation values of the source ANN match the actual firing rates of the target SNN. We show that inference time can be reduced by optimizing the upper bound of the fit curve in the revised ANN, achieving fast inference. Our theory explains existing work on fast inference and obtains better results. Experimental results show that the proposed method achieves near-lossless conversion with VGG-16, PreActResNet-18, and deeper architectures. Moreover, it reaches 8.6x faster inference at 0.265x the energy consumption of the typical method. The code is available at https://github.com/DingJianhao/OptSNNConvertion-RNL-RIL.
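To make the conversion idea concrete, below is a minimal, illustrative sketch (not the authors' implementation; see the linked repository for that). It shows (1) a ReLU-like activation clipped at a trainable upper bound, standing in for the Rate Norm Layer described above, and (2) a soft-reset integrate-and-fire neuron whose empirical firing rate over a fixed number of time steps is compared against that clipped activation. The class name ClippedActivation, the function if_firing_rate, and the soft-reset dynamics are assumptions made for illustration.

```python
import torch
import torch.nn as nn


class ClippedActivation(nn.Module):
    """Illustrative stand-in for the Rate Norm Layer (naming is an assumption):
    a ReLU whose output is capped by a trainable upper bound, so the ANN
    activation saturates the way an integrate-and-fire neuron's firing rate
    saturates at one spike per time step."""

    def __init__(self, init_bound: float = 1.0):
        super().__init__()
        self.bound = nn.Parameter(torch.tensor(init_bound))  # trainable cap

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Behaves like ReLU below the bound and saturates above it.
        return torch.minimum(torch.relu(x), self.bound)


def if_firing_rate(current: torch.Tensor, threshold: float = 1.0,
                   steps: int = 32) -> torch.Tensor:
    """Simulate a soft-reset integrate-and-fire neuron driven by a constant
    input current and return its empirical firing rate over `steps` steps."""
    v = torch.zeros_like(current)
    spikes = torch.zeros_like(current)
    for _ in range(steps):
        v = v + current
        fired = (v >= threshold).float()
        spikes += fired
        v = v - fired * threshold  # soft reset: subtract the threshold
    return spikes / steps


if __name__ == "__main__":
    x = torch.linspace(-0.5, 2.0, 6)
    ann_act = ClippedActivation(init_bound=1.0)(x)
    snn_rate = if_firing_rate(x, threshold=1.0, steps=32)
    # Compare the clipped ANN activation with the SNN firing rate.
    print(torch.stack([x, ann_act.detach(), snn_rate]))
```

Running the script shows the clipped activation closely tracking the empirical firing rate; the residual gap, which is the source of conversion error, shrinks as the number of simulation steps grows. This is exactly the accuracy-versus-inference-time trade-off that the paper's analysis addresses.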


