A Time-to-first-spike Coding and Conversion Aware Training for Energy-Efficient Deep Spiking Neural Network Processor Design

08/09/2022
by Dongwoo Lew et al.

In this paper, we present an energy-efficient SNN architecture that can seamlessly run deep spiking neural networks (SNNs) with improved accuracy. First, we propose conversion-aware training (CAT) to reduce ANN-to-SNN conversion loss without hardware implementation overhead. In the proposed CAT, an activation function developed to simulate SNN behavior during ANN training is exploited to reduce the data representation error after conversion. Building on the CAT technique, we also present a time-to-first-spike (TTFS) coding that allows lightweight logarithmic computation by utilizing spike time information. An SNN processor supporting the proposed techniques has been implemented in a 28 nm CMOS process. Running VGG-16 with 5-bit logarithmic weights on CIFAR-10, CIFAR-100, and Tiny-ImageNet, the processor achieves a top-1 accuracy of 91.7% on CIFAR-10 and an inference energy of 1426 uJ on Tiny-ImageNet.
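The core ideas above can be illustrated with a minimal sketch. Assuming a TTFS scheme in which a neuron firing at discrete time step t encodes the value 2^(-t) (so multiplications reduce to bit shifts in hardware), a conversion-aware activation would round ANN activations to exactly those representable values during training. The function names, the time window `t_max`, and the power-of-two representation are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

def cat_activation(x, t_max=8):
    """Hedged sketch of a conversion-aware activation: clamp to [0, 1]
    and snap each positive activation to the nearest power of two that
    a spike time in {0, ..., t_max} can represent, so the ANN trains on
    exactly the values the converted TTFS SNN can express."""
    x = np.clip(x, 0.0, 1.0)
    # Candidate spike time: earlier spike (smaller t) = larger value.
    t = np.round(-np.log2(np.maximum(x, 2.0 ** -t_max)))
    t = np.clip(t, 0, t_max)
    return np.where(x > 0, 2.0 ** -t, 0.0)

def ttfs_encode(x, t_max=8):
    """Map a (power-of-two) activation to its first-spike time."""
    x = np.clip(x, 2.0 ** -t_max, 1.0)
    return np.round(-np.log2(x)).astype(int)
```

With this encoding, an activation of 0.5 fires at t = 1 and 0.3 snaps to 0.25 (t = 2); a postsynaptic accumulation of `w * 2**-t` can then be realized as a shift of the weight by t bits rather than a full multiply, which is the "lightweight logarithmic computation" the abstract refers to.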

