Exact Gradient Computation for Spiking Neural Networks Through Forward Propagation

10/18/2022
by Jane H Lee, et al.

Spiking neural networks (SNNs) have recently emerged as alternatives to traditional neural networks, owing to their energy efficiency and their capacity to better capture biological neuronal mechanisms. However, the classic backpropagation algorithm for training traditional networks has been notoriously difficult to apply to SNNs due to the hard thresholding and discontinuities at spike times. Therefore, a large majority of prior work believes that exact gradients of SNNs w.r.t. their weights do not exist and has focused on approximation methods that produce surrogate gradients. In this paper, (1) by applying the implicit function theorem to SNNs at the discrete spike times, we prove that, although non-differentiable in time, SNNs have well-defined gradients w.r.t. their weights, and (2) we propose a novel training algorithm, called forward propagation (FP), that computes exact gradients for SNNs. FP exploits the causality structure between spikes and allows us to parallelize computation forward in time. It can be used with other algorithms that simulate the forward pass, and it also provides insight into why related algorithms such as Hebbian learning, as well as recently proposed surrogate gradient methods, may perform well.
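The key idea in the abstract can be illustrated on a toy model. The sketch below is not the paper's FP algorithm; it only shows, for a hypothetical non-leaky integrate-and-fire neuron with constant input, how the implicit function theorem yields an exact spike-time gradient w.r.t. a weight even though the spike itself is a hard-threshold event:

```python
def spike_time(w, I=1.0, theta=1.0):
    # Toy membrane potential V(t) = w * I * t crosses threshold theta
    # at the spike time t* = theta / (w * I).
    return theta / (w * I)

def dspike_dw_ift(w, I=1.0, theta=1.0):
    # The spike time t* is defined implicitly by F(t*, w) = w*I*t* - theta = 0.
    # The implicit function theorem gives
    #   dt*/dw = -(dF/dw) / (dF/dt*) = -(I * t*) / (w * I) = -t* / w,
    # which is well-defined even though the spike train is discontinuous in time.
    t_star = spike_time(w, I, theta)
    return -(I * t_star) / (w * I)

w = 2.0
eps = 1e-6
numeric = (spike_time(w + eps) - spike_time(w - eps)) / (2 * eps)
print(dspike_dw_ift(w), numeric)  # both approximately -0.25
```

The central-difference check agrees with the implicit-function-theorem gradient, matching the paper's claim that exact weight gradients exist despite non-differentiability in time.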


Related research

09/17/2020 - EventProp: Backpropagation for Exact Gradients in Spiking Neural Networks
  We derive the backpropagation algorithm for spiking neural networks comp...

05/20/2022 - EXODUS: Stable and Efficient Training of Spiking Neural Networks
  Spiking Neural Networks (SNNs) are gaining significant traction in machi...

07/02/2019 - A Hybrid Learning Rule for Efficient and Rapid Inference with Spiking Neural Networks
  The emerging neuromorphic computing (NC) architectures have shown compel...

12/02/2022 - Loss shaping enhances exact gradient learning with EventProp in Spiking Neural Networks
  In a recent paper Wunderlich and Pehle introduced the EventProp algorith...

03/02/2020 - Explicitly Trained Spiking Sparsity in Spiking Neural Networks with Backpropagation
  Spiking Neural Networks (SNNs) are being explored for their potential en...

01/01/2020 - Exploring Adversarial Attack in Spiking Neural Networks with Spike-Compatible Gradient
  Recently, backpropagation through time inspired learning algorithms are ...

09/13/2022 - Optimization without Backpropagation
  Forward gradients have been recently introduced to bypass backpropagatio...
