Temporally Efficient Deep Learning with Spikes

06/13/2017
by   Peter O'Connor, et al.

The vast majority of natural sensory data is temporally redundant. Video frames or audio samples taken at nearby points in time tend to have similar values. Typically, deep learning algorithms take no advantage of this redundancy to reduce computation. This can be an obscene waste of energy. We present a variant on backpropagation for neural networks in which computation scales with the rate of change of the data, not the rate at which we process the data. We do this by having neurons communicate a combination of their state and their temporal change in state. Intriguingly, this simple communication rule gives rise to units that resemble biologically-inspired leaky integrate-and-fire neurons, and to a weight-update rule that is equivalent to a form of Spike-Timing Dependent Plasticity (STDP), a synaptic learning rule observed in the brain. We demonstrate that on MNIST and a temporal variant of MNIST, our algorithm performs about as well as a Multilayer Perceptron trained with backpropagation, despite only communicating discrete values between layers.
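The core idea, communicating a quantized combination of a neuron's state and its change in state so that computation scales with how fast the input changes, can be illustrated with a minimal sigma-delta-style sketch. This is not the paper's implementation; the coefficients `k_p`, `k_d` and the `threshold` value here are illustrative assumptions, chosen only to show the encode/decode mechanics.

```python
import numpy as np

def sigma_delta_encode(signal, threshold=1.0, k_p=0.1, k_d=1.0):
    """Quantize a time-varying signal into sparse integer 'spike counts'.

    Each step transmits a combination of the current state (k_p * x)
    and the change in state (k_d * dx). The residual between what we
    wanted to send and what we actually sent is carried forward, so
    steps where the signal barely changes emit zeros: downstream work
    scales with the signal's rate of change, not the frame rate.
    """
    phi = 0.0      # carried quantization residual
    prev = 0.0     # previous input sample
    spikes = []
    for x in signal:
        u = k_p * x + k_d * (x - prev)   # state + change-in-state message
        prev = x
        phi += u / threshold
        s = np.round(phi)                # emit an integer spike count
        phi -= s
        spikes.append(int(s))
    return spikes

def leaky_decode(spikes, threshold=1.0, k_p=0.1, k_d=1.0):
    """Reconstruct an approximation of the input by leaky integration,
    resembling a leaky integrate-and-fire neuron's membrane dynamics."""
    x_hat = 0.0
    out = []
    for s in spikes:
        # invert u = k_p*x + k_d*(x - x_hat_prev) with u ~ s * threshold
        x_hat = (k_d * x_hat + s * threshold) / (k_p + k_d)
        out.append(x_hat)
    return out
```

For a near-constant input, most emitted values are zero and the leaky decoder hovers around the true value; a rapidly changing input would produce proportionally more nonzero messages.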

Related research

11/12/2017 - BP-STDP: Approximating Backpropagation using Spike Timing Dependent Plasticity
07/27/2020 - Supervised Learning in Temporally-Coded Spiking Neural Networks with Approximate Backpropagation
07/08/2020 - BS4NN: Binarized Spiking Neural Networks with Temporal Coding and Learning
11/24/2020 - A More Biologically Plausible Local Learning Rule for ANNs
09/19/2015 - STDP as presynaptic activity times rate of change of postsynaptic activity
03/26/2023 - Lazy learning: a biologically-inspired plasticity rule for fast and energy efficient synaptic plasticity
08/21/2018 - Backpropagation and Biological Plausibility
