
DCT-SNN: Using DCT to Distribute Spatial Information over Time for Learning Low-Latency Spiking Neural Networks

10/05/2020
by Isha Garg, et al.

Spiking Neural Networks (SNNs) offer a promising alternative to traditional deep learning frameworks, since they provide higher computational efficiency due to event-driven information processing. SNNs distribute the analog values of pixel intensities into binary spikes over time. However, the most widely used input coding schemes, such as Poisson-based rate coding, do not leverage the additional temporal learning capability of SNNs effectively. Moreover, these SNNs suffer from high inference latency, which is a major bottleneck to their deployment. To overcome this, we propose a scalable time-based encoding scheme that utilizes the Discrete Cosine Transform (DCT) to reduce the number of timesteps required for inference. DCT decomposes an image into a weighted sum of sinusoidal basis images. At each timestep, the Hadamard product of the DCT coefficients and a single frequency basis, taken in order, is given to an accumulator that generates spikes upon crossing a threshold. We use the proposed scheme to learn DCT-SNN, a low-latency deep SNN with leaky integrate-and-fire neurons, trained using surrogate-gradient-based backpropagation. We achieve a top-1 accuracy of 89.94% on CIFAR-10, along with competitive results on CIFAR-100 and TinyImageNet, using VGG architectures. Notably, DCT-SNN performs inference with 2-14X lower latency than other state-of-the-art SNNs, while achieving accuracy comparable to their standard deep learning counterparts. The dimension of the transform controls the number of timesteps required for inference. Additionally, accuracy can be traded off against latency in a principled manner by dropping the highest-frequency components during inference.
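The encoding described in the abstract can be made concrete with a short sketch. The Python/NumPy code below is a minimal illustration, not the authors' released implementation: the low-to-high frequency ordering, unit firing threshold, and soft (subtractive) reset are illustrative assumptions, and SciPy's DCT routines stand in for the transform. At each timestep, the contribution of one frequency (its coefficient times its basis image) is added to a per-pixel integrate-and-fire accumulator, which emits a binary spike map.

# A minimal sketch (not the authors' released code) of the DCT-based temporal
# encoding: each timestep adds one frequency component's contribution to a
# per-pixel integrate-and-fire accumulator, which emits a binary spike map when
# the accumulated value crosses a threshold. The frequency ordering, threshold
# value, and soft reset below are illustrative assumptions.
import numpy as np
from scipy.fft import dctn, idctn

def dct_temporal_encode(image, num_steps, threshold=1.0):
    """Encode a 2D image (H x W floats) into `num_steps` binary spike maps."""
    coeffs = dctn(image, norm="ortho")            # 2D DCT coefficients
    h, w = image.shape

    # Visit frequencies from low to high (sorted by u + v); an assumed
    # stand-in for the fixed frequency ordering used during encoding.
    order = sorted(((u, v) for u in range(h) for v in range(w)),
                   key=lambda uv: (uv[0] + uv[1], uv[0]))[:num_steps]

    membrane = np.zeros_like(image)               # per-pixel accumulator
    spikes = []
    for u, v in order:
        # Contribution of a single frequency: the inverse DCT of the masked
        # coefficient array, i.e. coefficient * basis image for that (u, v).
        single = np.zeros_like(coeffs)
        single[u, v] = coeffs[u, v]
        membrane += idctn(single, norm="ortho")

        spike = (membrane >= threshold).astype(np.float32)
        membrane -= spike * threshold             # soft reset where a spike fired
        spikes.append(spike)
    return np.stack(spikes)                       # shape: (num_steps, H, W)

# Example: encode a random 32x32 input over 48 timesteps.
spike_train = dct_temporal_encode(np.random.rand(32, 32), num_steps=48)
print(spike_train.shape)                          # (48, 32, 32)

In this sketch, running fewer timesteps simply drops the highest-frequency components from the end of the ordering, which mirrors the accuracy-versus-latency trade-off described in the abstract.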


Related Research

08/09/2020 · DIET-SNN: Direct Input Encoding With Leakage and Threshold Optimization in Deep Spiking Neural Networks
Bio-inspired spiking neural networks (SNNs), operating with asynchronous...

04/26/2021 · Spatio-Temporal Pruning and Quantization for Low-latency Spiking Neural Networks
Spiking Neural Networks (SNNs) are a promising alternative to traditiona...

10/01/2021 · One Timestep is All You Need: Training Spiking Neural Networks with Ultra Low Latency
Spiking Neural Networks (SNNs) are energy efficient alternatives to comm...

03/26/2020 · T2FSNN: Deep Spiking Neural Networks with Time-to-first-spike Coding
Spiking neural networks (SNNs) have gained considerable interest due to ...

10/05/2020 · Revisiting Batch Normalization for Training Low-latency Deep Spiking Neural Networks from Scratch
Spiking Neural Networks (SNNs) have recently emerged as an alternative t...

10/06/2021 · Spike-inspired Rank Coding for Fast and Accurate Recurrent Neural Networks
Biological spiking neural networks (SNNs) can temporally encode informat...

12/06/2020 · Rethinking FUN: Frequency-Domain Utilization Networks
The search for efficient neural network architectures has gained much fo...