
LIAF-Net: Leaky Integrate and Analog Fire Network for Lightweight and Efficient Spatiotemporal Information Processing

by Zhenzhi Wu, et al.

Spiking neural networks (SNNs) based on the Leaky Integrate and Fire (LIF) model have been applied to energy-efficient temporal and spatiotemporal processing tasks. Thanks to its bio-plausible neuronal dynamics and simplicity, the LIF-based SNN benefits from event-driven processing; however, it usually suffers from reduced performance. This may be because neurons in an LIF-SNN transmit information only via binary spikes. To address this issue, in this work we propose a Leaky Integrate and Analog Fire (LIAF) neuron model, in which analog values are transmitted among neurons, and build on it a deep network, termed LIAF-Net, for efficient spatiotemporal processing. In the temporal domain, LIAF follows the traditional LIF dynamics to retain its temporal processing capability. In the spatial domain, LIAF integrates spatial information through convolutional or fully-connected integration. As a spatiotemporal layer, LIAF can also be used jointly with traditional artificial neural network (ANN) layers. Experimental results indicate that LIAF-Net achieves performance comparable to the Gated Recurrent Unit (GRU) and Long Short-Term Memory (LSTM) on the bAbI Question Answering (QA) tasks, and achieves state-of-the-art performance on spatiotemporal Dynamic Vision Sensor (DVS) datasets, including MNIST-DVS, CIFAR10-DVS and DVS128 Gesture, with far fewer synaptic weights and much lower computational overhead than traditional networks built with LSTM, GRU, Convolutional LSTM (ConvLSTM) or 3D convolution (Conv3D). Compared with a traditional LIF-SNN, LIAF-Net also shows a dramatic accuracy gain in all these experiments. In conclusion, LIAF-Net provides a framework that combines the advantages of both ANNs and SNNs for lightweight and efficient spatiotemporal information processing.
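To make the distinction concrete, the sketch below contrasts one time step of a standard LIF neuron with a simplified LIAF-style step: both share the same leaky integration and threshold-triggered reset, but LIF emits a binary spike while LIAF passes an analog activation of the membrane potential to the next layer. The exact update rule, decay form, activation choice (ReLU here) and reset scheme are illustrative assumptions, not the paper's precise formulation.

```python
import numpy as np

def lif_step(v, x, decay=0.5, threshold=1.0):
    """One step of a standard LIF neuron: output is a binary spike.

    v: membrane potential from the previous step (array)
    x: integrated input, e.g. a conv or fully-connected response (array)
    Returns (new_v, spike_output).
    """
    v = decay * v + x                  # leaky integration
    fired = v >= threshold             # threshold firing condition
    out = fired.astype(np.float32)     # binary spike train
    v = np.where(fired, 0.0, v)        # reset fired neurons
    return v, out

def liaf_step(v, x, decay=0.5, threshold=1.0):
    """One step of a simplified LIAF-style neuron (illustrative assumption).

    Same LIF-style dynamics, but the transmitted value is an analog
    activation of the membrane potential rather than a binary spike;
    the threshold is kept only to trigger the reset.
    """
    v = decay * v + x                  # identical leaky integration
    fired = v >= threshold             # firing condition, used for reset only
    out = np.maximum(v, 0.0)           # analog fire: ReLU of the potential
    v = np.where(fired, 0.0, v)        # reset fired neurons
    return v, out

# Comparison on the same input: LIF loses the graded magnitudes that LIAF keeps.
v0 = np.zeros(3)
x = np.array([0.4, 1.2, -0.3])
_, spikes = lif_step(v0, x)            # -> [0., 1., 0.]
_, analog = liaf_step(v0, x)           # -> [0.4, 1.2, 0.]
```

In a deep network such as LIAF-Net, `x` would come from a convolutional or fully-connected integration over the previous layer's analog outputs, which is what lets LIAF layers interoperate with ordinary ANN layers.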
