A Free Lunch From ANN: Towards Efficient, Accurate Spiking Neural Networks Calibration

06/13/2021
by Yuhang Li, et al.

Spiking Neural Networks (SNNs) have been recognized as one of the next-generation neural networks. Conventionally, an SNN is converted from a pre-trained Artificial Neural Network (ANN) by simply replacing the ReLU activation with a spike activation while keeping the parameters intact. Perhaps surprisingly, in this work we show that properly calibrating the parameters during ANN-to-SNN conversion can bring significant improvements. We introduce SNN Calibration, a cheap but extraordinarily effective method that leverages the knowledge within the pre-trained ANN. Starting from a theoretical analysis of the conversion error and how it propagates through layers, we propose a calibration algorithm that corrects the error layer by layer. The calibration requires only a handful of training samples and several minutes to finish. Moreover, our calibration algorithm can produce SNNs with state-of-the-art architectures, including MobileNet and RegNet, on the large-scale ImageNet dataset. Extensive experiments demonstrate the effectiveness and efficiency of our algorithm: for example, our advanced pipeline can increase top-1 accuracy by up to 69% compared to baselines. Code is released at https://github.com/yhhhli/SNN_Calibration.
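To make the two ideas in the abstract concrete, below is a minimal PyTorch sketch (not the released code) of (1) the rate-coding view of ANN-to-SNN conversion, where an integrate-and-fire (IF) neuron's average firing rate over T time steps approximates a clipped ReLU, and (2) a light bias-calibration step that shifts a layer's bias by the mean conversion error so the SNN's expected activation matches the ANN's. Names such as `IFNeuron` and `calibrate_bias` are illustrative assumptions, not the repository's API.

```python
import torch
import torch.nn as nn


class IFNeuron(nn.Module):
    """IF neuron with soft reset; stands in for ReLU after conversion."""

    def __init__(self, threshold: float = 1.0, T: int = 32):
        super().__init__()
        self.threshold = threshold
        self.T = T

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Drive the membrane with a constant input current x for T steps
        # and return the time-averaged spike output (rate coding).
        mem = torch.zeros_like(x)
        out = torch.zeros_like(x)
        for _ in range(self.T):
            mem = mem + x
            fired = (mem >= self.threshold).float()
            out = out + fired * self.threshold
            mem = mem - fired * self.threshold  # soft (subtractive) reset
        return out / self.T  # ~ min(max(x, 0), threshold) as T grows


@torch.no_grad()
def calibrate_bias(layer: nn.Conv2d,
                   ann_act: torch.Tensor,
                   snn_act: torch.Tensor) -> None:
    """Shift the conv bias by the per-channel mean conversion error so the
    SNN's expected activation matches the ANN's, one layer at a time.
    Assumes NCHW feature maps and a layer created with bias=True."""
    err = (ann_act - snn_act).mean(dim=(0, 2, 3))
    layer.bias.add_(err)
```

In this sketch, `ann_act` and `snn_act` would be activations collected on the handful of calibration samples mentioned above; the advanced pipeline referenced in the abstract goes beyond this bias correction and also adjusts the layer weights.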

