EXODUS: Stable and Efficient Training of Spiking Neural Networks

05/20/2022
by Felix Christian Bauer, et al.

Spiking Neural Networks (SNNs) are gaining significant traction in machine learning tasks where energy efficiency is of utmost importance. Training such networks with the state-of-the-art back-propagation through time (BPTT) is, however, very time-consuming. Previous work by Shrestha and Orchard [2018] employs an efficient GPU-accelerated back-propagation algorithm called SLAYER, which speeds up training considerably. SLAYER, however, does not take the neuron reset mechanism into account when computing gradients, which we argue is the source of numerical instability. To counteract this, SLAYER introduces a gradient scale hyperparameter across layers, which requires manual tuning. In this paper, (i) we modify SLAYER and design an algorithm called EXODUS that accounts for the neuron reset mechanism and applies the Implicit Function Theorem (IFT) to calculate the correct gradients (equivalent to those computed by BPTT), (ii) we eliminate the need for ad-hoc scaling of gradients, thus substantially reducing training complexity, and (iii) we demonstrate, via computer simulations, that EXODUS is numerically stable and achieves comparable or better performance than SLAYER, especially on tasks where SNNs must rely on temporal features. Our code is available at https://github.com/synsense/sinabs-exodus.
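To make the role of the reset term concrete, the sketch below shows a leaky integrate-and-fire (LIF) layer in which the subtractive reset is part of the computation graph, so BPTT with a surrogate gradient sees its contribution. This is a minimal, hypothetical PyTorch illustration only; the class names (`SurrogateSpike`, `LIFLayer`) are made up here, and it reproduces neither the EXODUS/IFT algorithm nor the sinabs-exodus API.

```python
# Illustrative sketch (not the authors' implementation): LIF layer with an
# explicit membrane reset, trained by BPTT with a surrogate spike gradient.
import torch
import torch.nn as nn


class SurrogateSpike(torch.autograd.Function):
    """Heaviside spike in the forward pass, smooth surrogate in the backward pass."""

    @staticmethod
    def forward(ctx, v_minus_thr):
        ctx.save_for_backward(v_minus_thr)
        return (v_minus_thr > 0).float()

    @staticmethod
    def backward(ctx, grad_output):
        (v_minus_thr,) = ctx.saved_tensors
        # Fast-sigmoid-style surrogate derivative of the Heaviside step.
        surrogate = 1.0 / (1.0 + 10.0 * v_minus_thr.abs()) ** 2
        return grad_output * surrogate


class LIFLayer(nn.Module):
    def __init__(self, n_in, n_out, alpha=0.9, threshold=1.0):
        super().__init__()
        self.linear = nn.Linear(n_in, n_out, bias=False)
        self.alpha = alpha          # membrane leak factor
        self.threshold = threshold

    def forward(self, x):
        # x: (batch, time, n_in) input spike trains
        batch, n_steps, _ = x.shape
        v = torch.zeros(batch, self.linear.out_features, device=x.device)
        spikes = []
        for t in range(n_steps):
            v = self.alpha * v + self.linear(x[:, t])
            s = SurrogateSpike.apply(v - self.threshold)
            # Subtractive reset: kept in the graph so BPTT propagates
            # gradients through the reset as well as the leak.
            v = v - s * self.threshold
            spikes.append(s)
        return torch.stack(spikes, dim=1)   # (batch, time, n_out)


if __name__ == "__main__":
    layer = LIFLayer(n_in=8, n_out=4)
    x = (torch.rand(2, 50, 8) < 0.3).float()    # random Poisson-like input
    loss = layer(x).sum()
    loss.backward()                             # BPTT through dynamics and reset
    print(layer.linear.weight.grad.norm())
```

Dropping the reset term from the backward pass (as the abstract attributes to SLAYER) removes a stabilizing negative feedback from the gradient computation; EXODUS instead recovers BPTT-equivalent gradients via the IFT without per-layer gradient scaling.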

