Spiking Neural Networks with Improved Inherent Recurrence Dynamics for Sequential Learning

09/04/2021
by   Wachirawit Ponghiran, et al.

Spiking neural networks (SNNs) with leaky integrate-and-fire (LIF) neurons can be operated in an event-driven manner and have internal states to retain information over time, providing opportunities for energy-efficient neuromorphic computing, especially on edge devices. Note, however, that many representative works on SNNs do not fully demonstrate the usefulness of their inherent recurrence (membrane potentials retaining information about the past) for sequential learning. Most such works train SNNs to recognize static images from input representations artificially expanded in time through rate coding. We show that SNNs can be trained for sequential tasks and propose modifications to a network of LIF neurons that enable internal states to learn long sequences and make their inherent recurrence resilient to the vanishing gradient problem. We then develop a scheme to train the proposed SNNs with improved inherent recurrence dynamics. Our training scheme allows spiking neurons to produce multi-bit outputs (as opposed to binary spikes), which helps mitigate the mismatch between the derivative of the spiking neurons' activation function and the surrogate derivative used to overcome their non-differentiability. Our experimental results indicate that the proposed SNN architecture yields accuracy comparable to that of LSTMs on the TIMIT and LibriSpeech 100h datasets (within 1.10%) with fewer parameters than LSTMs. The sparse SNN outputs also lead to 10.13x and 11.14x savings in multiplication operations compared to GRUs, which are generally considered a lightweight alternative to LSTMs, on the TIMIT and LibriSpeech 100h datasets, respectively.
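The inherent recurrence the abstract refers to can be sketched in a few lines: an LIF neuron keeps a membrane potential as internal state, leaks it toward zero each time step, and emits a spike when the potential crosses a threshold. The function name, leak factor, and soft-reset rule below are illustrative assumptions, not the paper's exact formulation.

```python
# Minimal sketch of a leaky integrate-and-fire (LIF) neuron's inherent
# recurrence: the membrane potential v is an internal state that carries
# information about past inputs across time steps.
# Constants (beta, threshold) are illustrative, not from the paper.

def lif_step(v, x, beta=0.9, threshold=1.0):
    """One time step of an LIF neuron.

    v: membrane potential carried over from the previous step
       (the neuron's internal state, i.e. its inherent recurrence)
    x: input current at this step
    beta: leak factor (closer to 1 means longer memory of the past)
    """
    v = beta * v + x                 # leaky integration of the input
    spike = 1 if v >= threshold else 0
    v = v - spike * threshold        # soft reset after a spike
    return v, spike

# Drive the neuron with a short input sequence; sub-threshold inputs
# accumulate in v until the third step pushes it over the threshold.
v, spikes = 0.0, []
for x in [0.4, 0.4, 0.4, 0.0, 0.9]:
    v, s = lif_step(v, x)
    spikes.append(s)
print(spikes)  # [0, 0, 1, 0, 0]
```

Because the spike (here a hard threshold) has a zero-almost-everywhere derivative, training such a neuron with backpropagation requires a surrogate derivative in the backward pass, which is the mismatch the paper's multi-bit outputs aim to reduce.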



Related research

06/26/2022
Gradient-based Neuromorphic Learning on Dynamical RRAM Arrays
We present MEMprop, the adoption of gradient-based learning to train ful...

02/25/2020
sBSNN: Stochastic-Bits Enabled Binary Spiking Neural Network with On-Chip Learning for Energy Efficient Neuromorphic Computing at the Edge
In this work, we propose stochastic Binary Spiking Neural Network (sBSNN...

06/22/2023
Accelerating SNN Training with Stochastic Parallelizable Spiking Neurons
Spiking neural networks (SNN) are able to learn spatiotemporal features ...

11/09/2019
Action Recognition Using Supervised Spiking Neural Networks
Biological neurons use spikes to process and learn temporally dynamic in...

10/23/2022
Towards Energy-Efficient, Low-Latency and Accurate Spiking LSTMs
Spiking Neural Networks (SNNs) have emerged as an attractive spatio-temp...

12/01/2022
Surrogate Gradient Spiking Neural Networks as Encoders for Large Vocabulary Continuous Speech Recognition
Compared to conventional artificial neurons that produce dense and real-...

08/21/2023
SpikingBERT: Distilling BERT to Train Spiking Language Models Using Implicit Differentiation
Large language Models (LLMs), though growing exceedingly powerful, compr...
