Workload-Balanced Pruning for Sparse Spiking Neural Networks

02/13/2023
by Ruokai Yin, et al.

Pruning for Spiking Neural Networks (SNNs) has emerged as a fundamental methodology for deploying deep SNNs on resource-constrained edge devices. Although existing pruning methods can provide extremely high weight sparsity for deep SNNs, this high sparsity brings a workload imbalance problem. Specifically, workload imbalance occurs when different numbers of non-zero weights are assigned to hardware units running in parallel, which results in low hardware utilization and thus longer latency and higher energy cost. In preliminary experiments, we show that sparse SNNs (∼98% weight sparsity) can suffer from hardware utilization as low as ∼59%. To address the workload imbalance problem, we propose u-Ticket, in which we monitor and adjust the weight connections of the SNN during Lottery Ticket Hypothesis (LTH) based pruning, thus guaranteeing that the final ticket achieves optimal utilization when deployed onto the hardware. Experiments indicate that our u-Ticket can guarantee up to 100% hardware utilization and reduce the energy cost by up to 63.8%.
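
To make the utilization notion above concrete, here is a minimal Python sketch of how per-unit workload and utilization could be measured for a sparse weight matrix, together with a toy rebalancing pass. It assumes a simple round-robin mapping of weight-matrix rows to processing elements (PEs); the function names (pe_workloads, utilization, rebalance_mask), the mapping, and the random prune-and-regrow step are illustrative assumptions, not the paper's actual u-Ticket algorithm.

import numpy as np

def pe_workloads(weight: np.ndarray, num_pes: int) -> np.ndarray:
    """Count non-zero weights assigned to each PE under a round-robin row mapping."""
    nnz_per_row = np.count_nonzero(weight, axis=1)
    workloads = np.zeros(num_pes, dtype=np.int64)
    for row, nnz in enumerate(nnz_per_row):
        workloads[row % num_pes] += nnz
    return workloads

def utilization(weight: np.ndarray, num_pes: int) -> float:
    """Utilization: average PE workload relative to the busiest PE (which sets latency)."""
    w = pe_workloads(weight, num_pes)
    return float(w.mean() / w.max()) if w.max() > 0 else 1.0

def rebalance_mask(weight: np.ndarray, num_pes: int) -> np.ndarray:
    """Toy utilization-aware adjustment: prune connections from overloaded PEs
    and regrow the same number on underloaded PEs, keeping overall sparsity
    roughly unchanged (illustrative only; regrown weights would be retrained)."""
    mask = (weight != 0).astype(np.int8)
    workloads = pe_workloads(weight, num_pes)
    target = int(round(workloads.mean()))
    rng = np.random.default_rng(0)
    for pe in range(num_pes):
        rows = np.arange(pe, weight.shape[0], num_pes)  # rows mapped to this PE
        diff = workloads[pe] - target
        if diff == 0:
            continue
        # Overloaded PE: pick existing connections to prune; underloaded: pick zeros to regrow.
        idx = np.argwhere(mask[rows] == (1 if diff > 0 else 0))
        if len(idx) == 0:
            continue
        picks = idx[rng.choice(len(idx), size=min(abs(diff), len(idx)), replace=False)]
        for r, c in picks:
            mask[rows[r], c] = 0 if diff > 0 else 1
    return mask

if __name__ == "__main__":
    w = np.random.default_rng(1).standard_normal((128, 256))
    w[np.random.default_rng(2).random(w.shape) < 0.98] = 0.0   # ~98% weight sparsity
    print("utilization before:", utilization(w, num_pes=16))
    print("utilization after :", utilization(rebalance_mask(w, 16), num_pes=16))

In this sketch the busiest PE determines layer latency, so evening out the per-PE non-zero counts directly raises utilization; the paper's contribution is to perform this kind of adjustment during LTH-based iterative pruning rather than as a post-hoc step.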

Related research

10/12/2017 - STDP Based Pruning of Connections and Weight Quantization in Spiking Neural Networks for Energy Efficient Recognition
Spiking Neural Networks (SNNs) with a large number of weights and varied...

07/04/2022 - Lottery Ticket Hypothesis for Spiking Neural Networks
Spiking Neural Networks (SNNs) have recently emerged as a new generation...

05/11/2020 - CSB-RNN: A Faster-than-Realtime RNN Acceleration Framework with Compressed Structured Blocks
Recurrent neural networks (RNNs) have been widely adopted in temporal se...

03/14/2022 - Skydiver: A Spiking Neural Network Accelerator Exploiting Spatio-Temporal Workload Balance
Spiking Neural Networks (SNNs) are developed as a promising alternative ...

04/24/2023 - Neurogenesis Dynamics-inspired Spiking Neural Network Training Acceleration
Biologically inspired Spiking Neural Networks (SNNs) have attracted sign...

05/11/2021 - Pruning of Deep Spiking Neural Networks through Gradient Rewiring
Spiking Neural Networks (SNNs) have been attached great importance due t...

05/20/2023 - Probabilistic Modeling: Proving the Lottery Ticket Hypothesis in Spiking Neural Network
The Lottery Ticket Hypothesis (LTH) states that a randomly-initialized l...
