SATA: Sparsity-Aware Training Accelerator for Spiking Neural Networks

04/11/2022
by Ruokai Yin, et al.

Spiking Neural Networks (SNNs) have gained considerable attention as a potentially energy-efficient alternative to conventional Artificial Neural Networks (ANNs) due to their inherently high activation sparsity. Recently, SNNs trained with backpropagation through time (BPTT) have achieved higher accuracy on image recognition tasks than SNNs trained with other algorithms. Despite this algorithmic success, prior works have neglected to evaluate the hardware energy overheads of BPTT, owing to the lack of a hardware evaluation platform for SNN training algorithm design. Moreover, although SNNs have long been seen as an energy-efficient counterpart to ANNs, a quantitative comparison between the training costs of SNNs and ANNs is missing. To address these issues, we introduce SATA (Sparsity-Aware Training Accelerator), a BPTT-based training accelerator for SNNs. SATA provides a simple and reconfigurable accelerator architecture as a general-purpose hardware evaluation platform, which makes it easier to analyze the training energy of SNN training algorithms. Based on SATA, we present a quantitative analysis of the energy efficiency of SNN training and compare the training costs of SNNs and ANNs. The results show that, even when sparsity (in spikes, the gradient of the firing function, and the gradient of the membrane potential) is taken into account, SNNs consume 1.27× more total energy than ANNs. We find that this high training energy cost stems from the time-repetitive convolution operations and data movements during backpropagation. Moreover, to guide future SNN training algorithm design, we provide several observations on energy efficiency with respect to different SNN-specific training parameters.
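The abstract names three sparsity sources that SATA exploits during training: spikes, the gradient of the firing function, and the gradient of the membrane potential. A minimal NumPy sketch of the first two, using a toy leaky integrate-and-fire (LIF) layer with a boxcar surrogate gradient (all parameter values and the boxcar width are illustrative assumptions, not SATA's actual configuration):

```python
# Toy LIF layer over T timesteps, measuring how sparse the spikes
# (forward pass) and the surrogate firing-function gradient
# (backward pass) are. Hypothetical parameters, not from the paper.
import numpy as np

rng = np.random.default_rng(0)

T, N = 4, 1000            # timesteps, neurons (illustrative sizes)
threshold, leak = 1.0, 0.5

x = rng.normal(0.3, 0.5, size=(T, N))  # toy input currents

u = np.zeros(N)                        # membrane potential
spike_density, grad_density = [], []

for t in range(T):
    u = leak * u + x[t]                      # leaky integration
    s = (u >= threshold).astype(float)       # fire on threshold crossing
    # Surrogate gradient of the non-differentiable firing function:
    # a boxcar that is nonzero only near the threshold, so most
    # backward-pass entries are zero and can be skipped.
    g = (np.abs(u - threshold) < 0.5).astype(float)
    u = u * (1.0 - s)                        # hard reset after spiking

    spike_density.append(s.mean())
    grad_density.append(g.mean())

print(f"avg spike density: {np.mean(spike_density):.2f}")
print(f"avg surrogate-gradient density: {np.mean(grad_density):.2f}")
```

Both densities are typically well below 1, which is why skipping zero operands saves energy; note, though, that the same convolution kernels are re-applied at every one of the T timesteps, which is the time-repetitive cost the abstract identifies as dominating BPTT training energy.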



Related research:

- 09/06/2023 · Are SNNs Truly Energy-efficient? - A Hardware Perspective
- 05/02/2022 · Sparse Compressed Spiking Neural Network Accelerator for Object Detection
- 06/25/2019 · A Winograd-based Integrated Photonics Accelerator for Convolutional Neural Networks
- 11/25/2020 · AccSS3D: Accelerator for Spatially Sparse 3D DNNs
- 07/25/2021 · H2Learn: High-Efficiency Learning Accelerator for High-Accuracy Spiking Neural Networks
- 01/18/2022 · FPGA-optimized Hardware acceleration for Spiking Neural Networks
- 05/12/2021 · High-Performance FPGA-based Accelerator for Bayesian Neural Networks
