Recognizing Images with at most one Spike per Neuron

12/30/2019
by Christoph Stöckl, et al.

In order to port the performance of trained artificial neural networks (ANNs) to spiking neural networks (SNNs), which can be implemented in neuromorphic hardware with drastically reduced energy consumption, an efficient ANN-to-SNN conversion is needed. Previous conversion schemes focused on representing the analog output of a rectified linear (ReLU) gate in the ANN by the firing rate of a spiking neuron. But this is not possible for other commonly used ANN gates, and it reduces throughput even for ReLU gates. We introduce a new conversion method in which a gate in the ANN, which can basically be of any type, is emulated by a small circuit of spiking neurons with At Most One Spike (AMOS) per neuron. We show that this AMOS conversion raises the accuracy of SNNs for ImageNet above the previous 74.60%, bringing it closer to the best available ANN accuracy (85.0%), and raises the Top-5 accuracy of SNNs to 95.82%. In addition, AMOS conversion improves the latency and throughput of spike-based image classification by several orders of magnitude. Hence these results suggest that SNNs provide a viable direction for developing highly energy-efficient hardware for AI that combines high performance with versatility of applications.
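The key idea of representing an analog gate output with a handful of neurons that each fire at most once, instead of a firing rate accumulated over many timesteps, can be sketched in a few lines. The Python snippet below is a minimal illustration assuming a simple binary-weighted quantization; the function names (amos_style_encode, amos_style_decode) and the power-of-two weighting are choices made for this example, not the exact AMOS circuit of the paper.

```python
def amos_style_encode(x, n_neurons=8, x_max=1.0):
    """Encode a non-negative analog value (e.g. a ReLU output) as a pattern of
    n_neurons spike/no-spike decisions, each neuron firing at most once.
    Illustrative binary-weighted scheme, not the paper's actual AMOS circuit."""
    x = min(max(x, 0.0), x_max)                       # clip to the representable range
    q = int(round(x / x_max * (2 ** n_neurons - 1)))  # quantize to an n_neurons-bit integer
    # bit k of q decides whether neuron k spikes (most significant bit first)
    return [(q >> k) & 1 for k in reversed(range(n_neurons))]


def amos_style_decode(spikes, x_max=1.0):
    """Reconstruct the analog value from the at-most-one-spike pattern."""
    n = len(spikes)
    q = sum(bit << (n - 1 - k) for k, bit in enumerate(spikes))
    return q / (2 ** n - 1) * x_max


# A ReLU output of 0.3 is carried by 8 spike/no-spike decisions instead of the
# hundreds of spikes a rate code would need for comparable precision.
pattern = amos_style_encode(0.3)
print(pattern, amos_style_decode(pattern))  # reconstructed value is within 1/255 of 0.3
```

In this sketch, n neurons distinguish 2^n activation levels in a single pass, which illustrates why at-most-one-spike coding can be far faster than rate coding, where precision grows only linearly with the number of spikes.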

Related research

Classifying Images with Few Spikes per Neuron (01/31/2020)
Spiking neural networks (SNNs) promise to provide AI implementations wit...

Are training trajectories of deep single-spike and deep ReLU network equivalent? (06/14/2023)
Communication by binary and sparse spikes is a key factor for the energy...

RMP-SNNs: Residual Membrane Potential Neuron for Enabling Deeper High-Accuracy and Low-Latency Spiking Neural Networks (02/25/2020)
Spiking Neural Networks (SNNs) have recently attracted significant resea...

TCL: an ANN-to-SNN Conversion with Trainable Clipping Layers (08/11/2020)
Spiking Neural Networks (SNNs) provide significantly lower power dissipa...

A Time-to-first-spike Coding and Conversion Aware Training for Energy-Efficient Deep Spiking Neural Network Processor Design (08/09/2022)
In this paper, we present an energy-efficient SNN architecture, which ca...

Bridging the Gap between ANNs and SNNs by Calibrating Offset Spikes (02/21/2023)
Spiking Neural Networks (SNNs) have attracted great attention due to the...

Constructing Accurate and Efficient Deep Spiking Neural Networks with Double-threshold and Augmented Schemes (05/05/2020)
Spiking neural networks (SNNs) are considered as a potential candidate t...
