Towards Energy-Efficient, Low-Latency and Accurate Spiking LSTMs

10/23/2022
by Gourav Datta, et al.

Spiking Neural Networks (SNNs) have emerged as an attractive spatio-temporal computing paradigm for complex vision tasks. However, most existing works yield models that require many time steps and do not leverage the inherent temporal dynamics of SNNs, even for sequential tasks. Motivated by this observation, we propose an optimized spiking long short-term memory (LSTM) training framework that involves a novel ANN-to-SNN conversion framework, followed by SNN training. In particular, we propose novel activation functions in the source LSTM architecture and judiciously select a subset of them for conversion to integrate-and-fire (IF) activations with optimal bias shifts. Additionally, we derive the leaky-integrate-and-fire (LIF) activation functions converted from their non-spiking LSTM counterparts, which justifies the need to jointly optimize the weights, threshold, and leak parameter. We also propose a pipelined parallel processing scheme that hides the SNN time steps, significantly improving system latency, especially for long sequences. The resulting SNNs have high activation sparsity and require only accumulate operations (AC), in contrast to the expensive multiply-and-accumulates (MAC) needed for ANNs (except for the input layer when using direct encoding), yielding significant improvements in energy efficiency. We evaluate our framework on sequential learning tasks including the temporal MNIST, Google Speech Commands (GSC), and UCI Smartphone datasets on different LSTM architectures. We obtain a test accuracy of 94.75% on the GSC dataset with 4.1x lower energy than an iso-architecture standard LSTM.
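As a rough illustration of the spiking activations the abstract refers to, the sketch below implements a generic leaky-integrate-and-fire (LIF) neuron with a subtractive ("soft") reset. It is not taken from the paper: the threshold `v_th` and leak factor `leak` are placeholder constants standing in for the parameters the authors learn jointly with the weights, and the input is arbitrary random data.

```python
# Minimal LIF activation sketch (illustrative only, not the authors' code).
import numpy as np

def lif_activation(inputs, v_th=1.0, leak=0.9):
    """Run an LIF neuron over `inputs` of shape (T, N): pre-activations for
    T time steps and N neurons. Returns binary spikes of the same shape."""
    v_mem = np.zeros(inputs.shape[1])           # membrane potential per neuron
    spikes = np.zeros_like(inputs)
    for t, x_t in enumerate(inputs):
        v_mem = leak * v_mem + x_t               # leaky integration of input
        spikes[t] = (v_mem >= v_th).astype(inputs.dtype)  # fire when above threshold
        v_mem = v_mem - spikes[t] * v_th         # soft reset by subtraction
    return spikes

# Example: 8 time steps, 4 neurons with random pre-activations.
rng = np.random.default_rng(0)
out = lif_activation(rng.normal(0.5, 0.3, size=(8, 4)))
print(out.mean())  # fraction of (time step, neuron) pairs that spiked
```

Setting `leak=1.0` recovers the plain integrate-and-fire (IF) case used for the converted LSTM activations; the binary spikes are what allow downstream layers to replace MACs with accumulate-only operations.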


