Entropy-Based Modeling for Estimating Soft Errors Impact on Binarized Neural Network Inference

04/10/2020
by Navid Khoshavi, et al.

Over the past years, easy access to large-scale datasets has significantly shifted the paradigm toward developing highly accurate prediction models driven by neural networks (NNs). These models can be affected by radiation-induced transient faults that may lead to gradual degradation of a long-running NN inference accelerator. A key observation from our rigorous vulnerability assessment of the NN inference accelerator is that the weights and activation functions are unevenly susceptible to both single-event upsets (SEUs) and multi-bit upsets (MBUs), especially in the first five layers of the selected convolutional neural network. In this paper, we present relatively accurate statistical models that delineate the impact of both SEUs and MBUs across all layers and per layer of the selected NN. These models can be used to evaluate the error resiliency of an NN topology before adopting it in safety-critical applications.
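
Below is a minimal sketch of the kind of per-layer SEU/MBU fault-injection experiment described above, written in NumPy. The toy layer shapes, fault counts, burst width, trial counts, and the prediction-mismatch metric are all illustrative assumptions, not the paper's actual accelerator model, benchmark network, or statistical models.

```python
# Minimal sketch of a per-layer SEU/MBU fault-injection study on a toy
# binarized network. All shapes, fault rates, and the network itself are
# illustrative assumptions, not the authors' accelerator or benchmark.
import numpy as np

rng = np.random.default_rng(0)

def binarize(w):
    """Sign-binarize weights to {-1, +1}."""
    return np.where(w >= 0, 1.0, -1.0)

def inject_faults(w_bin, n_events, burst=1):
    """Inject `n_events` fault events into a binarized weight tensor.
    burst=1 models an SEU (one flipped bit per event); burst>1 models an
    MBU, flipping `burst` adjacent positions per event."""
    faulty = w_bin.copy().ravel()
    sites = rng.choice(faulty.size, size=n_events, replace=False)
    for s in sites:
        idx = np.arange(s, min(s + burst, faulty.size))
        faulty[idx] = -faulty[idx]  # a bit flip toggles the sign of a binarized weight
    return faulty.reshape(w_bin.shape)

def forward(x, layers):
    """Tiny fully-connected BNN forward pass with sign activations;
    no sign on the output layer, argmax over the real-valued logits."""
    h = x
    for w in layers[:-1]:
        h = np.sign(h @ w)
    return (h @ layers[-1]).argmax(axis=1)

# Toy network and input data (assumed for illustration only).
layer_shapes = [(64, 128), (128, 128), (128, 10)]
layers = [binarize(rng.standard_normal(s)) for s in layer_shapes]
x = rng.standard_normal((1000, 64))
y_ref = forward(x, layers)  # fault-free predictions used as the reference

# Per-layer vulnerability estimate: prediction-mismatch rate under SEU vs. MBU.
for li in range(len(layers)):
    for burst, label in [(1, "SEU"), (4, "MBU")]:
        mismatches = []
        for _ in range(20):  # repeated trials for a statistical estimate
            faulty_layers = list(layers)
            faulty_layers[li] = inject_faults(layers[li], n_events=8, burst=burst)
            mismatches.append(np.mean(forward(x, faulty_layers) != y_ref))
        print(f"layer {li} {label}: mean mismatch rate = {np.mean(mismatches):.4f}")
```

In this sketch an SEU is modeled as a single sign flip per event and an MBU as a burst of adjacent flips; repeating the injection over many trials gives a rough per-layer estimate of how often faults change the network's predictions, which is the kind of layer-wise susceptibility signal the statistical models in the paper aim to capture.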

Related Research

Statistical Modeling of Soft Error Influence on Neural Networks (10/12/2022)
Soft errors in large VLSI circuits pose dramatic influence on computing-...

Tuning of extended state observer with neural network-based control performance assessment (03/29/2021)
The extended state observer (ESO) is an inherent element of robust obser...

Compiling Neural Networks for a Computational Memory Accelerator (03/05/2020)
Computational memory (CM) is a promising approach for accelerating infer...

On the Resilience of RTL NN Accelerators: Fault Characterization and Mitigation (06/14/2018)
Machine Learning (ML) is making a strong resurgence in tune with the mas...

Hardware faults that matter: Understanding and Estimating the safety impact of hardware faults on object detection DNNs (09/07/2022)
Object detection neural network models need to perform reliably in highl...

Arithmetic-Intensity-Guided Fault Tolerance for Neural Network Inference on GPUs (04/19/2021)
Neural networks (NNs) are increasingly employed in domains that require ...

Neural Network Layer Matrix Decomposition reveals Latent Manifold Encoding and Memory Capacity (09/12/2023)
We prove the converse of the universal approximation theorem, i.e. a neu...
