Improving Surrogate Gradient Learning in Spiking Neural Networks via Regularization and Normalization

12/13/2021
by Nandan Meda, et al.

Spiking neural networks (SNNs) differ from the classical networks used in deep learning: their neurons communicate through discrete electrical impulses called spikes, much like biological neurons. SNNs are appealing for AI because they can be implemented on low-power neuromorphic chips. However, SNNs generally remain less accurate than their analog counterparts. In this report, we examine various regularization and normalization techniques with the goal of improving surrogate gradient learning in SNNs.
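Surrogate gradient learning trains SNNs with backpropagation by replacing the ill-defined derivative of the hard spike threshold with a smooth surrogate during the backward pass. The following is a minimal illustrative sketch of that idea in PyTorch, assuming a zero-threshold Heaviside spike function and a fast-sigmoid surrogate; the class name SurrogateSpike and the steepness value are hypothetical choices for illustration, not the paper's implementation.

```python
import torch

class SurrogateSpike(torch.autograd.Function):
    """Heaviside spike function with a fast-sigmoid surrogate gradient."""

    @staticmethod
    def forward(ctx, membrane_potential):
        ctx.save_for_backward(membrane_potential)
        # Forward pass: emit a spike (1.0) wherever the potential exceeds the threshold (0).
        return (membrane_potential > 0).float()

    @staticmethod
    def backward(ctx, grad_output):
        (membrane_potential,) = ctx.saved_tensors
        # Backward pass: substitute the derivative of a fast sigmoid,
        # 1 / (beta * |u| + 1)^2, for the undefined derivative of the step.
        beta = 10.0  # illustrative steepness; a tunable hyperparameter
        surrogate = 1.0 / (beta * membrane_potential.abs() + 1.0) ** 2
        return grad_output * surrogate

spike_fn = SurrogateSpike.apply

# Usage: gradients flow through the surrogate even though the forward pass is binary.
membrane = torch.randn(4, requires_grad=True)
spikes = spike_fn(membrane - 1.0)  # 1.0 is an illustrative firing threshold
spikes.sum().backward()
```

Regularization (e.g., penalties on firing rates) and normalization layers would then be applied around such spiking units during training; the specific techniques studied are detailed in the full report.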
