Fluctuation-driven initialization for spiking neural network training

06/21/2022
by Julian Rossbroich et al.

Spiking neural networks (SNNs) underlie low-power, fault-tolerant information processing in the brain and could constitute a power-efficient alternative to conventional deep neural networks when implemented on suitable neuromorphic hardware accelerators. However, instantiating SNNs that solve complex computational tasks in silico remains a significant challenge. Surrogate gradient (SG) techniques have emerged as a standard solution for training SNNs end-to-end. Still, as with conventional artificial neural networks (ANNs), their success depends on synaptic weight initialization. Yet, unlike for ANNs, it remains unclear what constitutes a good initial state for an SNN. Here, we develop a general initialization strategy for SNNs inspired by the fluctuation-driven regime commonly observed in the brain. Specifically, we derive practical solutions for data-dependent weight initialization that ensure fluctuation-driven firing in the widely used leaky integrate-and-fire (LIF) neurons. We show empirically that SNNs initialized following our strategy exhibit superior learning performance when trained with SGs. These findings generalize across several datasets and SNN architectures, including fully connected, deep convolutional, recurrent, and more biologically plausible SNNs obeying Dale's law. Thus, fluctuation-driven initialization provides a practical, versatile, and easy-to-implement strategy for improving SNN training performance on diverse tasks in neuromorphic engineering and computational neuroscience.
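
To make the idea concrete, here is a minimal sketch of a fluctuation-driven weight initializer for a single current-based LIF unit. It assumes independent Poisson inputs at a common rate, zero-mean Gaussian weights, and an unnormalized double-exponential PSP kernel; the kernel choice, the target-fluctuation parameter xi, and the function name fluctuation_driven_init are illustrative assumptions, not the paper's exact derivation.

```python
import numpy as np


def fluctuation_driven_init(n_in, nu_in, tau_mem=20e-3, tau_syn=5e-3,
                            theta=1.0, u_rest=0.0, xi=2.0, rng=None):
    """Draw zero-mean feed-forward weights for one current-based LIF unit
    so that its membrane-potential fluctuations have standard deviation
    sigma_U = (theta - u_rest) / xi.

    Assumes n_in independent Poisson inputs at nu_in spikes/s and a
    double-exponential PSP kernel
        eps(t) = exp(-t / tau_mem) - exp(-t / tau_syn), t >= 0.
    (Illustrative simplification, not the paper's exact derivation.)
    """
    rng = np.random.default_rng(rng)
    # Integral of the squared PSP kernel, needed by Campbell's theorem:
    # int eps(t)^2 dt = tau_mem/2 + tau_syn/2 - 2*tau_mem*tau_syn/(tau_mem+tau_syn)
    eps_sq_int = (0.5 * tau_mem + 0.5 * tau_syn
                  - 2.0 * tau_mem * tau_syn / (tau_mem + tau_syn))
    # With zero-mean weights, sigma_U^2 = n_in * nu_in * sigma_w^2 * eps_sq_int,
    # so solve for the weight standard deviation that hits the target sigma_U.
    target_sigma_u = (theta - u_rest) / xi
    sigma_w = target_sigma_u / np.sqrt(n_in * nu_in * eps_sq_int)
    return rng.normal(0.0, sigma_w, size=n_in)


# Example: 200 Poisson inputs at 10 Hz; membrane fluctuations should end up
# around half the distance from rest to threshold (xi = 2).
w = fluctuation_driven_init(n_in=200, nu_in=10.0)
print(w.std())
```

In practice, the input rate nu_in would be estimated from the data (for example, the average firing rate of the preceding layer over a few training batches), which is what makes such an initialization data-dependent.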
