SpikeGrad: An ANN-equivalent Computation Model for Implementing Backpropagation with Spikes

Event-based neuromorphic systems promise to reduce the energy consumption of deep learning tasks by replacing expensive floating point operations on dense matrices with low-power, sparse, and asynchronous operations on spike events. While these systems can be trained increasingly well using approximations of the backpropagation algorithm, such implementations usually require high-precision errors during training and are therefore incompatible with the typical communication infrastructure of neuromorphic circuits. In this work, we analyze how the gradient can be discretized into spike events when training a spiking neural network. We show that a special implementation of the integrate-and-fire neuron allows us to describe the accumulated activations and errors of the spiking neural network in terms of an equivalent artificial neural network, which greatly speeds up training compared to an explicit simulation of all spike events. This way we are able to demonstrate that, even for deep networks, gradients can be discretized sufficiently well with spikes if the gradient is properly rescaled. This form of spike-based backpropagation enables us to achieve accuracies on the MNIST and CIFAR10 datasets that are equivalent to or better than those of comparable state-of-the-art spiking neural networks trained with full-precision gradients. The algorithm, which we call SpikeGrad, is based on accumulation and comparison operations and can naturally exploit sparsity in the gradient computation, which makes it an interesting choice for spiking neuromorphic systems with on-chip learning capacities.
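The ANN-equivalence argument above rests on a property of the integrate-and-fire neuron: the total number of signed spikes it emits depends only on the accumulated input, not on the order or timing of individual events, so the spike count behaves like a quantized activation. The sketch below illustrates this with a hypothetical signed integrate-and-fire neuron (threshold subtraction on firing); it is a minimal illustration of the principle, not the paper's actual SpikeGrad implementation.

```python
def if_neuron_accumulate(inputs, threshold=1.0):
    """Signed integrate-and-fire neuron (illustrative sketch).

    Accumulates input contributions into a membrane potential and emits
    a +1 spike whenever the potential reaches +threshold (a -1 spike at
    -threshold), subtracting the threshold after each spike. Returns the
    total signed spike count.
    """
    potential = 0.0
    spikes = 0
    for x in inputs:
        potential += x
        while potential >= threshold:
            potential -= threshold
            spikes += 1
        while potential <= -threshold:
            potential += threshold
            spikes -= 1
    return spikes

# The final spike count approximates sum(inputs) / threshold to within
# one spike, regardless of event order -- the property that lets the
# accumulated spiking activity be described by an equivalent ANN.
events = [0.4, 0.9, -0.2, 1.3, 0.7]          # sum = 3.1
print(if_neuron_accumulate(events))           # -> 3
print(if_neuron_accumulate(events[::-1]))     # -> 3 (order-invariant)
```

The same mechanism, applied in the backward pass, is what allows errors to be communicated as discrete spike events rather than high-precision values.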


Related research

- 06/06/2023: Spike-based computation using classical recurrent neural networks. "Spiking neural networks are a type of artificial neural networks in whic..."
- 07/03/2018: Is Neuromorphic MNIST neuromorphic? Analyzing the discriminative power of neuromorphic datasets in the time domain. "The advantage of spiking neural networks (SNNs) over their predecessors ..."
- 05/30/2022: Accelerating spiking neural network training. "Spiking neural networks (SNN) are a type of artificial network inspired ..."
- 03/11/2019: A Spiking Network for Inference of Relations Trained with Neuromorphic Backpropagation. "The increasing need for intelligent sensors in a wide range of everyday ..."
- 10/26/2018: Whetstone: A Method for Training Deep Artificial Neural Networks for Binary Communication. "This paper presents a new technique for training networks for low-precis..."
- 11/19/2022: Intelligence Processing Units Accelerate Neuromorphic Learning. "Spiking neural networks (SNNs) have achieved orders of magnitude improve..."
- 11/02/2016: Deep counter networks for asynchronous event-based processing. "Despite their advantages in terms of computational resources, latency, a..."
