HW/SW Codesign for Robust and Efficient Binarized SNNs by Capacitor Minimization

09/05/2023
by Mikail Yayla, et al.

Using accelerators based on analog computing is an efficient way to process the immensely large workloads in Neural Networks (NNs). One example of an analog computing scheme for NNs is Integrate-and-Fire (IF) Spiking Neural Networks (SNNs). However, to achieve high inference accuracy in IF-SNNs, the analog hardware needs to represent current-based multiply-accumulate (MAC) levels as spike times, for which a large membrane capacitor needs to be charged for a certain amount of time. A large capacitor results in high energy use, considerable area cost, and long latency, constituting one of the major bottlenecks in analog IF-SNN implementations. In this work, we propose a HW/SW Codesign method, called CapMin, for capacitor size minimization in analog computing IF-SNNs. CapMin minimizes the capacitor size by reducing the number of spike times needed for accurate operation of the HW, based on the absolute frequency of MAC level occurrences in the SW. To increase the robustness of IF-SNN operation to current variation, we propose the method CapMin-V, which trades capacitor size for protection, building on the reduced capacitor size found by CapMin. In our experiments, CapMin achieves more than a 14x reduction in capacitor size over the state of the art, while CapMin-V achieves increased variation tolerance in the IF-SNN operation, requiring only a small increase in capacitor size.
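The core idea of exploiting the frequency of MAC level occurrences can be illustrated with a small sketch. This is not the paper's exact algorithm; it is a hypothetical greedy selection that keeps only the most frequent MAC levels observed in software until a chosen coverage fraction is reached, on the assumption that fewer distinct levels mean fewer required spike times and thus a smaller capacitor:

```python
from collections import Counter

def select_mac_levels(mac_levels, coverage=0.99):
    """Illustrative sketch (hypothetical, not CapMin itself):
    keep the smallest set of MAC levels that covers `coverage`
    of all occurrences profiled in software, so the hardware
    only needs spike times for those levels."""
    counts = Counter(mac_levels)
    total = sum(counts.values())
    kept, covered = [], 0
    # Greedily keep the most frequent MAC levels first.
    for level, n in counts.most_common():
        kept.append(level)
        covered += n
        if covered / total >= coverage:
            break
    return sorted(kept)

# Example: in practice, most MAC results cluster in a few levels.
profiled = [3] * 50 + [4] * 30 + [5] * 15 + [10] * 4 + [20] * 1
print(select_mac_levels(profiled, coverage=0.95))  # [3, 4, 5]
```

In this toy profile, 95% coverage needs only 3 of the 5 distinct levels, shrinking the spike-time range the capacitor must support; raising the coverage threshold trades capacitor size back for accuracy.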

Related research

- Exact Error Backpropagation Through Spikes for Precise Training of Spiking Neural Networks (12/15/2022): "Event-based simulations of Spiking Neural Networks (SNNs) are fast and a..."
- Noisy Machines: Understanding Noisy Neural Networks and Enhancing Robustness to Analog Hardware Errors Using Distillation (01/14/2020): "The success of deep learning has brought forth a wave of interest in com..."
- An Efficient and Accurate Memristive Memory for Array-based Spiking Neural Networks (06/11/2023): "Memristors provide a tempting solution for weighted synapse connections ..."
- DIET-SNN: Direct Input Encoding With Leakage and Threshold Optimization in Deep Spiking Neural Networks (08/09/2020): "Bio-inspired spiking neural networks (SNNs), operating with asynchronous..."
- Input-Aware Dynamic Timestep Spiking Neural Networks for Efficient In-Memory Computing (05/27/2023): "Spiking Neural Networks (SNNs) have recently attracted widespread resear..."
- Hardware-Robust In-RRAM-Computing for Object Detection (05/09/2022): "In-memory computing is becoming a popular architecture for deep-learning..."
- Demonstrating Analog Inference on the BrainScaleS-2 Mobile System (03/29/2021): "We present the BrainScaleS-2 mobile system as a compact analog inference..."
