Improving Reliability of Spiking Neural Networks through Fault Aware Threshold Voltage Optimization

01/12/2023 · by Ayesha Siddique, et al.

Spiking neural networks (SNNs) have enabled breakthroughs in computer vision by lending themselves to neuromorphic hardware. However, neuromorphic hardware lacks parallelism, which limits the throughput and hardware acceleration of SNNs on edge devices. To address this problem, many systolic-array SNN accelerators (systolicSNNs) have been proposed recently, but their reliability remains a major concern. In this paper, we first extensively analyze the impact of permanent faults on systolicSNNs. Then, we present a novel fault mitigation method: fault-aware threshold voltage optimization in retraining (FalVolt). FalVolt optimizes the threshold voltage of each layer during retraining to achieve classification accuracy close to the baseline in the presence of faults. To demonstrate the effectiveness of our proposed mitigation, we classify both a static dataset (MNIST) and neuromorphic datasets (N-MNIST and DVS Gesture) on a 256x256 systolicSNN with stuck-at faults. We empirically show that the classification accuracy of a systolicSNN drops significantly even at extremely low fault rates (as low as 0.012%). Our proposed FalVolt mitigation method improves the performance of systolicSNNs by enabling them to operate at fault rates of up to 60% with a negligible drop in classification accuracy (as low as 0.1%). Our results show that FalVolt is 2x faster than other state-of-the-art techniques common in artificial neural networks (ANNs), such as fault-aware pruning and retraining without threshold voltage optimization.
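The intuition behind FalVolt can be illustrated with a toy experiment: stuck-at-0 faults in a weight array depress the membrane potentials of spiking neurons, so a firing threshold tuned for the fault-free network silences much of the layer; re-optimizing the per-layer threshold restores activity. The sketch below is NOT the paper's implementation (which retrains a full systolicSNN); it is a minimal, self-contained NumPy illustration in which `inject_stuck_at`, `lif_layer`, and the brute-force threshold search are all hypothetical names introduced here for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def inject_stuck_at(weights, fault_rate, stuck_value=0.0):
    """Model permanent stuck-at faults: force a random fraction of
    weights to a fixed value (here stuck-at-0)."""
    faulty = weights.copy()
    mask = rng.random(weights.shape) < fault_rate
    faulty[mask] = stuck_value
    return faulty

def lif_layer(inputs, weights, v_th):
    """One integrate-and-fire step: a neuron spikes where its membrane
    potential (weighted input sum) reaches the layer threshold v_th."""
    membrane = inputs @ weights
    return (membrane >= v_th).astype(float)

# Toy layer: 64 inputs, 32 neurons, binary input spikes with rate 0.3.
weights = rng.uniform(0.0, 1.0, size=(64, 32))
inputs = (rng.random((100, 64)) < 0.3).astype(float)

baseline_rate = lif_layer(inputs, weights, v_th=5.0).mean()

# Stuck-at-0 faults lower membrane potentials, so firing collapses ...
faulty = inject_stuck_at(weights, fault_rate=0.5, stuck_value=0.0)
faulty_rate = lif_layer(inputs, faulty, v_th=5.0).mean()

# ... and a simple per-layer threshold search (a stand-in for the
# gradient-based optimization used in retraining) recovers the rate.
candidates = np.linspace(0.5, 5.0, 50)
best_vth = min(candidates,
               key=lambda v: abs(lif_layer(inputs, faulty, v).mean()
                                 - baseline_rate))
recovered_rate = lif_layer(inputs, faulty, best_vth).mean()
```

In the paper the threshold is optimized jointly with the weights during fault-aware retraining rather than swept per layer, but the recovery mechanism is the same: the threshold is lowered (or raised, for stuck-at-1 faults) so that post-fault membrane potentials again straddle the firing boundary.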


Related research

- 01/12/2023: Exposing Reliability Degradation and Mitigation in Approximate DNNs under Permanent Faults
  Approximate computing is known for enhancing deep neural network acceler...
- 02/11/2018: Analyzing and Mitigating the Impact of Permanent Faults on a Systolic Array Based Neural Network Accelerator
  Due to their growing popularity and computational cost, deep neural netw...
- 08/23/2021: ReSpawn: Energy-Efficient Fault-Tolerance for Spiking Neural Networks considering Unreliable Memories
  Spiking neural networks (SNNs) have shown a potential for having low ene...
- 04/08/2023: RescueSNN: Enabling Reliable Executions on Spiking Neural Network Accelerators under Permanent Faults
  To maximize the performance and energy efficiency of Spiking Neural Netw...
- 07/31/2022: enpheeph: A Fault Injection Framework for Spiking and Compressed Deep Neural Networks
  Research on Deep Neural Networks (DNNs) has focused on improving perform...
- 04/10/2022: Analysis of Power-Oriented Fault Injection Attacks on Spiking Neural Networks
  Spiking Neural Networks (SNN) are quickly gaining traction as a viable a...
- 03/10/2022: SoftSNN: Low-Cost Fault Tolerance for Spiking Neural Network Accelerators under Soft Errors
  Specialized hardware accelerators have been designed and employed to max...
