Loss shaping enhances exact gradient learning with EventProp in Spiking Neural Networks

12/02/2022
by   Thomas Nowotny, et al.

In a recent paper, Wunderlich and Pehle introduced the EventProp algorithm, which enables training spiking neural networks by gradient descent on exact gradients. In this paper we present extensions of EventProp to support a wider class of loss functions, together with an implementation in the GPU enhanced Neuronal Networks (GeNN) framework that exploits sparsity. The GPU acceleration allows us to test EventProp extensively on more challenging learning benchmarks. We find that EventProp performs well on some tasks, but on others learning is slow or fails entirely. We analyse these issues in detail and find that they stem from the use of the exact gradient of the loss function, which by its nature carries no information about loss changes due to spike creation or deletion. Depending on the details of the task and loss function, descending the exact gradient with EventProp can delete important spikes, inadvertently increasing the loss and decreasing classification accuracy, and so cause learning to fail. In other situations, the lack of information about the benefit of creating additional spikes can starve earlier layers of gradient flow, slowing down learning. Finally, we present a first glimpse of a solution to these problems in the form of 'loss shaping', where we introduce a suitable weighting function into an integral loss to increase gradient flow from the output layer towards earlier layers.
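The loss-shaping idea can be sketched as follows (the notation here is illustrative, not taken verbatim from the paper): EventProp supports losses expressed as an integral of a function of the output-neuron membrane voltages, and loss shaping multiplies the integrand by a time-dependent weighting function.

```latex
% Integral loss over output voltages V(t), as supported by EventProp:
L \;=\; \int_0^T l_V\bigl(V(t), t\bigr)\,\mathrm{d}t
% Loss shaping: insert a weighting function w(t) into the integrand,
% e.g. one that emphasises particular parts of the trial,
% so that more gradient flows back towards earlier layers:
L_w \;=\; \int_0^T w(t)\, l_V\bigl(V(t), t\bigr)\,\mathrm{d}t
```

Since $L_w$ remains an integral loss of the same form, the exact EventProp gradient machinery applies to it unchanged.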

research
03/02/2020

Explicitly Trained Spiking Sparsity in Spiking Neural Networks with Backpropagation

Spiking Neural Networks (SNNs) are being explored for their potential en...
research
06/14/2017

Gradient Descent for Spiking Neural Networks

Much of studies on neural computation are based on network models of sta...
research
10/18/2022

Exact Gradient Computation for Spiking Neural Networks Through Forward Propagation

Spiking neural networks (SNN) have recently emerged as alternatives to t...
research
12/15/2022

Exact Error Backpropagation Through Spikes for Precise Training of Spiking Neural Networks

Event-based simulations of Spiking Neural Networks (SNNs) are fast and a...
research
01/01/2020

Exploring Adversarial Attack in Spiking Neural Networks with Spike-Compatible Gradient

Recently, backpropagation through time inspired learning algorithms are ...
research
04/16/2021

Controlled abstention neural networks for identifying skillful predictions for classification problems

The earth system is exceedingly complex and often chaotic in nature, mak...
research
09/24/2019

Monotonic Trends in Deep Neural Networks

The importance of domain knowledge in enhancing model performance and ma...
