Brain-Inspired Learning on Neuromorphic Substrates

10/22/2020
by Friedemann Zenke, et al.

Neuromorphic hardware strives to emulate brain-like neural networks and thus holds the promise for scalable, low-power information processing on temporal data streams. Yet, to solve real-world problems, these networks need to be trained. However, training on neuromorphic substrates creates significant challenges due to the offline character and the required non-local computations of gradient-based learning algorithms. This article provides a mathematical framework for the design of practical online learning algorithms for neuromorphic substrates. Specifically, we show a direct connection between Real-Time Recurrent Learning (RTRL), an online algorithm for computing gradients in conventional Recurrent Neural Networks (RNNs), and biologically plausible learning rules for training Spiking Neural Networks (SNNs). Further, we motivate a sparse approximation based on block-diagonal Jacobians, which reduces the algorithm's computational complexity, diminishes the non-local information requirements, and empirically leads to good learning performance, thereby improving its applicability to neuromorphic substrates. In summary, our framework bridges the gap between synaptic plasticity and gradient-based approaches from deep learning and lays the foundations for powerful information processing on future neuromorphic hardware systems.
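
To make the block-diagonal approximation concrete, the sketch below shows an online, eligibility-trace-style weight update for a small recurrent network of leaky spiking units in plain NumPy. This is a minimal illustration, not the authors' implementation: the network sizes, the leak factor, the fast-sigmoid surrogate derivative, the threshold of 1.0, and the random per-neuron target signal are all illustrative assumptions. The point it demonstrates is that each synapse's trace and update use only presynaptic and postsynaptic quantities, i.e. the cross-neuron terms of the full RTRL Jacobian are dropped.

# Minimal sketch (assumed details, not the paper's code): online gradient
# estimation in a recurrent network of leaky spiking units, with the
# block-diagonal Jacobian approximation expressed as per-synapse traces.
import numpy as np

rng = np.random.default_rng(0)

n_in, n_hid, T = 5, 10, 100
alpha = 0.9                                    # per-step leak of the membrane potential (assumed)
W_in = rng.normal(0.0, 0.3, (n_hid, n_in))     # input weights
W_rec = rng.normal(0.0, 0.3, (n_hid, n_hid))   # recurrent weights
np.fill_diagonal(W_rec, 0.0)                   # no self-connections

def surrogate_grad(u, beta=10.0):
    # Smooth pseudo-derivative of the spike nonlinearity (fast sigmoid),
    # evaluated at the distance from threshold (assumed form).
    return 1.0 / (beta * np.abs(u) + 1.0) ** 2

x_seq = (rng.random((T, n_in)) < 0.1).astype(float)     # random input spike trains
target = (rng.random((T, n_hid)) < 0.05).astype(float)  # illustrative per-neuron targets

v = np.zeros(n_hid)          # membrane potentials
s = np.zeros(n_hid)          # spikes emitted at the previous step
# Block-diagonal sensitivities: each neuron only tracks how its own state
# depends on its own incoming weights (cross-neuron Jacobian terms dropped).
e_in = np.zeros((n_hid, n_in))
e_rec = np.zeros((n_hid, n_hid))
g_in = np.zeros_like(W_in)   # accumulated online gradient estimates
g_rec = np.zeros_like(W_rec)

for t in range(T):
    x, s_prev = x_seq[t], s
    v = alpha * v + W_in @ x + W_rec @ s_prev   # leak plus synaptic drive
    sg = surrogate_grad(v - 1.0)                # pseudo-derivative at threshold 1.0
    s = (v > 1.0).astype(float)                 # spike where the threshold is crossed
    v = v - s                                   # soft reset of spiking units

    # Local eligibility traces: low-pass filtered presynaptic activity.
    e_in = alpha * e_in + x[None, :]
    e_rec = alpha * e_rec + s_prev[None, :]

    # Online update: per-neuron error times pseudo-derivative times trace.
    err = s - target[t]
    g_in += (err * sg)[:, None] * e_in
    g_rec += (err * sg)[:, None] * e_rec

# g_in and g_rec approximate the gradients that full RTRL would provide, using
# only information local to each synapse and its postsynaptic neuron; e.g.
# W_rec -= 0.01 * g_rec would take one online gradient step.

Because each neuron keeps sensitivities only for its own incoming synapses, memory and compute scale with the number of synapses rather than with the full RTRL sensitivity tensor, which is what makes this style of rule plausible for on-chip learning.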


