Event-based Backpropagation for Analog Neuromorphic Hardware

02/13/2023
by Christian Pehle, et al.

Neuromorphic computing aims to incorporate lessons from studying biological nervous systems into the design of computer architectures. While existing approaches have successfully implemented aspects of those computational principles, such as sparse, spike-based computation, event-based scalable learning has remained an elusive goal in large-scale systems. However, only then can the potential energy-efficiency advantages of neuromorphic systems relative to other hardware architectures be realized during learning. We present our progress implementing the EventProp algorithm using the example of the BrainScaleS-2 analog neuromorphic hardware. Previous gradient-based approaches to learning relied on "surrogate gradients" and dense sampling of observables, or were limited by assumptions on the underlying dynamics and loss functions. In contrast, our approach only requires spike-time observations from the system, while still being able to incorporate other system observables, such as membrane voltage measurements, in a principled way. This leads to a one-order-of-magnitude improvement in the information efficiency of the gradient estimate, which would translate directly into corresponding energy-efficiency improvements in an optimized hardware implementation. We present the theoretical framework for estimating gradients, results verifying the correctness of the estimation, and results on a low-dimensional classification task using the BrainScaleS-2 system. Building on this work has the potential to enable scalable gradient estimation in large-scale neuromorphic hardware, where continuous measurement of the system state would be prohibitive and energy-inefficient. It also suggests the feasibility of a full on-device implementation of the algorithm, which would enable scalable, energy-efficient, event-based learning in large-scale analog neuromorphic hardware.
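The information-efficiency argument follows from a simple scaling consideration: a dense readout of analog observables grows with the sampling rate of every neuron, whereas an event-based readout grows only with the number of spikes. The sketch below makes this back-of-the-envelope comparison explicit; all quantities in it (neuron count, trial duration, sampling rate, mean spike rate, and encoding sizes) are hypothetical assumptions chosen for illustration, not values from the paper or the BrainScaleS-2 system, and the resulting factor depends entirely on those choices.

```python
# Back-of-the-envelope comparison of readout data volume for gradient
# estimation: dense sampling of analog observables vs. event-based
# (spike-time) observation. All numbers are illustrative assumptions,
# not measurements from the paper or the BrainScaleS-2 system.

def dense_sampling_bytes(n_neurons, duration_s, sample_rate_hz, bytes_per_sample):
    """Data volume if every neuron's membrane voltage is sampled densely."""
    return n_neurons * duration_s * sample_rate_hz * bytes_per_sample


def event_based_bytes(n_neurons, duration_s, mean_rate_hz, bytes_per_event):
    """Data volume if only spike events (neuron id + timestamp) are read out."""
    return n_neurons * duration_s * mean_rate_hz * bytes_per_event


if __name__ == "__main__":
    # Hypothetical operating point: 512 neurons, a 10 ms trial,
    # 1 MHz voltage sampling at 1 byte/sample vs. a 100 Hz mean spike
    # rate at 8 bytes per event (neuron id plus timestamp).
    dense = dense_sampling_bytes(512, 10e-3, 1e6, 1)
    events = event_based_bytes(512, 10e-3, 100.0, 8)
    print(f"dense sampling : {dense:,.0f} bytes per trial")
    print(f"event-based    : {events:,.0f} bytes per trial")
    print(f"ratio          : {dense / events:,.0f}x")
```

The sketch only illustrates how the two readout strategies scale; the one-order-of-magnitude figure quoted in the abstract refers to the gradient estimate in the paper's own measurement setup, not to any particular set of numbers used here.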
