SNN2ANN: A Fast and Memory-Efficient Training Framework for Spiking Neural Networks

06/19/2022
by Jianxiong Tang, et al.

Spiking neural networks (SNNs) are efficient computation models for low-power environments. Spike-based backpropagation (BP) algorithms and ANN-to-SNN (ANN2SNN) conversion are successful techniques for SNN training. Nevertheless, spike-based BP training is slow and incurs large memory costs. Though ANN2SNN provides a low-cost way to train SNNs, it requires many inference steps to mimic the well-trained ANN and achieve good performance. In this paper, we propose an SNN-to-ANN (SNN2ANN) framework to train SNNs in a fast and memory-efficient way. SNN2ANN consists of two components: (a) a weight-sharing architecture between the ANN and the SNN, and (b) spiking mapping units. First, the architecture trains the shared parameters on the ANN branch, yielding fast training and low memory costs for the SNN. Second, the spiking mapping units ensure that the ANN's activation values are valid spiking features. As a result, the classification error of the SNN can be optimized by training the ANN branch. In addition, we design an adaptive threshold adjustment (ATA) algorithm to address the noisy spike problem. Experimental results show that SNN2ANN-based models perform well on benchmark datasets (CIFAR-10, CIFAR-100, and Tiny-ImageNet). Moreover, SNN2ANN achieves comparable accuracy with 0.625x the time steps, 0.377x the training time, 0.27x the GPU memory cost, and 0.33x the spike activity of the spike-based BP model.
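The abstract gives enough structure to sketch the core idea in code. Below is a minimal, hypothetical PyTorch illustration, not the paper's actual implementation: it assumes the spiking mapping unit clamps and quantizes ANN activations onto the discrete firing rates an integrate-and-fire neuron can express in T time steps, that a straight-through estimator carries the gradient, and that one convolution is shared between the ANN (training) branch and the SNN (inference) branch. The class and method names (SpikingMappingUnit, WeightSharedBlock, forward_ann, forward_snn) and the soft-reset dynamics are invented here for illustration.

```python
import torch
import torch.nn as nn


class SpikingMappingUnit(nn.Module):
    """Assumed behavior: map ANN activations onto the discrete firing-rate
    levels an integrate-and-fire neuron can express in `time_steps` steps."""

    def __init__(self, threshold: float = 1.0, time_steps: int = 4):
        super().__init__()
        self.threshold = threshold
        self.time_steps = time_steps

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Clamp to [0, threshold] and quantize to T levels, so the ANN branch
        # emits exactly the average spike features the SNN branch can produce.
        x = torch.clamp(x, 0.0, self.threshold)
        step = self.threshold / self.time_steps
        q = torch.round(x / step) * step
        # Straight-through estimator: quantized forward, identity backward.
        return x + (q - x).detach()


class WeightSharedBlock(nn.Module):
    """One conv block whose weights are shared by the ANN and SNN branches."""

    def __init__(self, in_ch: int, out_ch: int, time_steps: int = 4):
        super().__init__()
        self.conv = nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1)
        self.smu = SpikingMappingUnit(time_steps=time_steps)

    def forward_ann(self, x: torch.Tensor) -> torch.Tensor:
        # ANN branch: trained with ordinary backprop; the SMU keeps its
        # activations inside the spiking feature space.
        return self.smu(self.conv(x))

    def forward_snn(self, x_t: torch.Tensor, mem: torch.Tensor):
        # SNN branch (inference only): integrate-and-fire dynamics reusing
        # the same convolution weights, with a soft reset after each spike.
        mem = mem + self.conv(x_t)
        spike = (mem >= self.smu.threshold).float()
        mem = mem - spike * self.smu.threshold
        return spike, mem
```

Under these assumptions, training runs entirely through forward_ann with standard backprop, while inference unrolls forward_snn over T time steps with the same weights; avoiding the unrolled temporal computation graph during training is what would make this faster and lighter in memory than spike-based BP.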

