IMPULSE: A 65nm Digital Compute-in-Memory Macro with Fused Weights and Membrane Potential for Spike-based Sequential Learning Tasks

05/18/2021
by Amogh Agrawal, et al.

The inherent dynamics of the neuron membrane potential in Spiking Neural Networks (SNNs) allow the processing of sequential learning tasks while avoiding the complexity of recurrent neural networks. The highly sparse spike-based computations in such spatio-temporal data can be leveraged for energy efficiency. However, the membrane potential incurs additional memory access bottlenecks in current SNN hardware. To that effect, we propose a 10T-SRAM compute-in-memory (CIM) macro specifically designed for state-of-the-art SNN inference. It consists of a fused weight (WMEM) and membrane potential (VMEM) memory and inherently exploits sparsity in input spikes, leading to a 97.4% reduction in energy at the input sparsity typical of the SNNs considered in this work, compared to the case of no sparsity. We propose staggered data mapping and reconfigurable peripherals for handling the different bit-precision requirements of WMEM and VMEM, while supporting multiple neuron functionalities. The proposed macro was fabricated in 65nm CMOS technology, achieving an energy efficiency of 0.99 TOPS/W at 0.85V supply and 200MHz frequency for signed 11-bit operations. We evaluate the SNN on sentiment classification with the IMDB dataset of movie reviews and achieve accuracy within 1% of the baseline.
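The sparsity-driven savings described above come from skipping all weight and membrane-potential work for timesteps where an input does not spike. A minimal software sketch of that idea, using a generic leaky integrate-and-fire (LIF) update (the layer sizes, leak factor, and threshold below are illustrative assumptions, not values from the macro):

```python
import numpy as np

def lif_step(v, spikes_in, weights, leak=0.9, v_th=1.0):
    """One timestep of a leaky integrate-and-fire (LIF) layer.

    Only columns with an input spike are accumulated, mirroring how a
    sparsity-aware CIM macro can skip weight-memory accesses for
    non-spiking (zero) inputs. All constants here are illustrative.
    """
    active = np.flatnonzero(spikes_in)             # indices of input spikes
    v = leak * v                                   # membrane leak
    if active.size:                                # skip work entirely if no spikes
        v = v + weights[:, active].sum(axis=1)     # accumulate active columns only
    spikes_out = (v >= v_th).astype(np.uint8)      # fire where threshold is crossed
    v = np.where(spikes_out, 0.0, v)               # reset fired neurons
    return v, spikes_out

# Example: 2 neurons, 3 inputs, spikes on inputs 0 and 2
weights = np.array([[0.5, 0.2, 0.6],
                    [0.1, 0.0, 0.2]])
v = np.zeros(2)
v, out = lif_step(v, np.array([1, 0, 1]), weights)
```

Because the accumulation touches only the `active` columns, the number of memory accesses scales with the spike count rather than the layer width, which is the same effect the macro exploits in hardware.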


