Spike-inspired Rank Coding for Fast and Accurate Recurrent Neural Networks

by Alan Jeffares, et al.

Biological spiking neural networks (SNNs) can temporally encode information in their outputs, e.g. in the rank order in which neurons fire, whereas artificial neural networks (ANNs) conventionally do not. As a result, models of SNNs for neuromorphic computing are regarded as potentially more rapid and efficient than ANNs when dealing with temporal input. On the other hand, ANNs are simpler to train, and usually achieve superior performance. Here we show that temporal coding such as rank coding (RC) inspired by SNNs can also be applied to conventional ANNs such as LSTMs, and leads to computational savings and speedups. In our RC for ANNs, we apply backpropagation through time using the standard real-valued activations, but only from a strategically early time step of each sequential input example, decided by a threshold-crossing event. Learning then also naturally incorporates _when_ to produce an output, without other changes to the model or the algorithm. Both the forward and the backward training pass can be significantly shortened by skipping the remainder of the input sequence after that first event. RC-training also significantly reduces time-to-insight during inference, with a minimal decrease in accuracy. The desired speed-accuracy trade-off is tunable by varying the threshold or a regularization parameter that rewards output entropy. We demonstrate these benefits in two toy problems of sequence classification, and in a temporally-encoded MNIST dataset, where our RC model achieves 99.19% accuracy after the first input time-step, outperforming the state of the art in temporal coding with SNNs, as well as in spoken-word classification of Google Speech Commands, outperforming non-RC-trained early inference with LSTMs.
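The threshold-crossing early exit at the heart of this approach can be sketched in a few lines: run a recurrent model step by step, and stop at the first step whose softmax output crosses a confidence threshold, skipping the remaining inputs. The toy tanh cell, weight names, and random weights below are illustrative assumptions, not the authors' exact LSTM architecture or training setup.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def rank_coded_inference(xs, Wx, Wh, Wo, threshold=0.9):
    """Run a toy recurrent net over the sequence `xs`, exiting at the
    first time step whose maximum softmax probability reaches
    `threshold` (the threshold-crossing event). Returns the predicted
    class and the stopping step; later inputs are never processed."""
    h = np.zeros(Wh.shape[0])
    for t, x in enumerate(xs):
        h = np.tanh(Wx @ x + Wh @ h)      # recurrent state update
        p = softmax(Wo @ h)               # class probabilities at step t
        if p.max() >= threshold:          # first threshold-crossing event
            return int(p.argmax()), t     # early exit: skip the rest
    return int(p.argmax()), len(xs) - 1   # fall back to the final step

# Usage: random weights, 3 classes, an 8-step sequence of 4-dim inputs.
rng = np.random.default_rng(0)
Wx = rng.normal(size=(16, 4))
Wh = rng.normal(size=(16, 16))
Wo = rng.normal(size=(3, 16))
xs = rng.normal(size=(8, 4))
pred, stop_t = rank_coded_inference(xs, Wx, Wh, Wo, threshold=0.5)
```

Lowering `threshold` trades accuracy for earlier outputs, which is the speed-accuracy knob the abstract describes; during training, the loss (and backpropagation through time) would be applied only up to `stop_t`.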


