Temporal coding in spiking neural networks with alpha synaptic function

07/30/2019
by Iulia M. Comsa et al.

The timing of individual neuronal spikes is essential for biological brains to respond rapidly to sensory stimuli. However, conventional artificial neural networks lack the intrinsic temporal coding ability present in biological networks. We propose a spiking neural network model that encodes information in the relative timing of individual neuron spikes. In classification tasks, the output of the network is indicated by the first neuron to spike in the output layer. This temporal coding scheme allows supervised training of the network with backpropagation, using locally exact derivatives of the postsynaptic spike times with respect to presynaptic spike times. The network operates using a biologically plausible alpha synaptic transfer function. Additionally, we use trainable synchronisation pulses that provide bias, add flexibility during training, and exploit the decay part of the alpha function. We show that such networks can be trained successfully on noisy Boolean logic tasks and on the MNIST dataset encoded in time. The results show that the spiking neural network outperforms comparable spiking models on MNIST and achieves similar accuracy to fully connected conventional networks with the same architecture. We also find that the spiking network spontaneously discovers two operating regimes, mirroring the accuracy-speed trade-off observed in human decision-making: a slow regime, where a decision is reached only after all hidden neurons have spiked and accuracy is very high, and a fast regime, where a decision is reached quickly but with lower accuracy. These results demonstrate the computational power of spiking networks with biological characteristics that encode information in the timing of individual neurons. By studying temporal coding in spiking networks, we aim to create building blocks towards energy-efficient and more complex biologically inspired neural architectures.
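
The core mechanism described above is an alpha-shaped postsynaptic kernel whose weighted sum is compared against a firing threshold, with the predicted class given by the first output neuron to cross that threshold. The sketch below is a minimal illustration of that idea, not the authors' implementation: the kernel form eps(t) = t * exp(-t / tau), the decay constant, the threshold, and all weights are assumptions made for the example, and the threshold crossing is found here by numerical search, whereas the paper obtains it in closed form, which is what yields the exact spike-time derivatives used for backpropagation.

```python
import numpy as np

# Illustrative constants (assumed, not taken from the paper).
TAU = 1.0        # synaptic decay constant
THRESHOLD = 1.0  # membrane potential threshold for firing

def alpha_kernel(t):
    """Alpha-shaped postsynaptic potential: zero before the presynaptic
    spike, then rises, peaks at t = TAU, and decays back towards zero."""
    return np.where(t > 0.0, t * np.exp(-t / TAU), 0.0)

def membrane_potential(t, presyn_times, weights):
    """Weighted sum of alpha kernels triggered by presynaptic spikes."""
    return float(np.sum(weights * alpha_kernel(t - presyn_times)))

def first_spike_time(presyn_times, weights, t_max=10.0, dt=1e-3):
    """Earliest time the membrane potential reaches the threshold, found
    by a simple grid search (the paper derives this crossing analytically)."""
    for t in np.arange(0.0, t_max, dt):
        if membrane_potential(t, presyn_times, weights) >= THRESHOLD:
            return t
    return np.inf  # the neuron never fires within the window

# Toy readout: two inputs encoded as spike times (stronger input = earlier
# spike) feed two output neurons; the predicted class is the neuron that
# spikes first. Weights are hypothetical.
input_times = np.array([0.2, 0.8])
output_weights = np.array([[3.0, 1.0],
                           [1.0, 3.0]])
output_times = [first_spike_time(input_times, w) for w in output_weights]
print("output spike times:", output_times,
      "-> predicted class:", int(np.argmin(output_times)))
```

In the full model, hidden layers and the trainable synchronisation pulses mentioned in the abstract would contribute additional weighted kernels to the same sum; they are omitted here to keep the example focused on the kernel shape and the first-to-spike readout.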

Related research

11/29/2022  Timing-Based Backpropagation in Spiking Neural Networks Without Single-Spike Restrictions
We propose a novel backpropagation algorithm for training spiking neural...

10/27/2021  Latent Equilibrium: A unified learning theory for arbitrarily fast computation with arbitrarily slow neurons
The response time of physical computational elements is finite, and neur...

08/23/2019  Spiking Neural Predictive Coding for Continual Learning from Data Streams
For energy-efficient computation in specialized neuromorphic hardware, w...

01/23/2019  Robust computation with rhythmic spike patterns
Information coding by precise timing of spikes can be faster and more en...

06/09/2020  Hardware Implementation of Spiking Neural Networks Using Time-To-First-Spike Encoding
Hardware-based spiking neural networks (SNNs) are regarded as promising ...

10/06/2021  Spike-inspired Rank Coding for Fast and Accurate Recurrent Neural Networks
Biological spiking neural networks (SNNs) can temporally encode informat...

12/16/2019  Network of Evolvable Neural Units: Evolving to Learn at a Synaptic Level
Although Deep Neural Networks have seen great success in recent years th...
