Improving Realistic Worst-Case Performance of NVCiM DNN Accelerators through Training with Right-Censored Gaussian Noise

07/29/2023
by   Zheyu Yan, et al.

Compute-in-Memory (CiM), built upon non-volatile memory (NVM) devices, is promising for accelerating deep neural networks (DNNs) owing to its in-situ data processing capability and superior energy efficiency. Unfortunately, the well-trained model parameters, after being mapped to NVM devices, can often exhibit large deviations from their intended values due to device variations, resulting in notable performance degradation in these CiM-based DNN accelerators. There exists a long list of solutions to address this issue. However, they mainly focus on improving the mean performance of CiM DNN accelerators. How to guarantee the worst-case performance under the impact of device variations, which is crucial for many safety-critical applications such as self-driving cars, has been far less explored. In this work, we propose to use the k-th percentile performance (KPP) to capture the realistic worst-case performance of DNN models executing on CiM accelerators. Through a formal analysis of the properties of KPP and the noise injection-based DNN training, we demonstrate that injecting a novel right-censored Gaussian noise, as opposed to the conventional Gaussian noise, significantly improves the KPP of DNNs. We further propose an automated method to determine the optimal hyperparameters for injecting this right-censored Gaussian noise during the training process. Our method achieves up to a 26% improvement in KPP compared to state-of-the-art methods employed to enhance DNN robustness under the impact of device variations.
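The two ideas in the abstract can be sketched briefly: right-censoring a Gaussian clips every sample above a cap down to the cap (collapsing the upper tail), and KPP is simply the k-th percentile of accuracy across noisy model instances. The sketch below is illustrative only; the values of `sigma`, `cap`, and `k` are placeholder assumptions, not the paper's tuned hyperparameters.

```python
import numpy as np

def right_censored_gaussian(shape, sigma=0.05, cap=0.0, rng=None):
    """Sample Gaussian noise, then censor (clip) values above `cap`.

    Right-censoring maps every sample greater than `cap` onto `cap`,
    so the resulting distribution has no upper tail beyond the cap.
    `sigma` and `cap` here are illustrative, not the paper's values.
    """
    rng = rng or np.random.default_rng()
    noise = rng.normal(0.0, sigma, size=shape)
    return np.minimum(noise, cap)

def kpp(accuracies, k=5):
    """k-th percentile performance: the accuracy level that k% of
    noisy model instances fall at or below (a realistic worst case,
    as opposed to the mean or the absolute worst case)."""
    return np.percentile(accuracies, k)

# Toy usage: perturb a weight matrix during a training step.
rng = np.random.default_rng(0)
w = rng.standard_normal((4, 4))
w_noisy = w + right_censored_gaussian(w.shape, sigma=0.05, cap=0.0, rng=rng)

# Estimate KPP from simulated per-instance accuracies.
accs = rng.normal(0.90, 0.02, size=1000)
realistic_worst_case = kpp(accs, k=5)
```

In noise-injection training, such a perturbation would be applied to the weights on each forward pass so the network learns to tolerate the device-variation pattern that the censored distribution models.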


Related research

- Computing-In-Memory Neural Network Accelerators for Safety-Critical Systems: Can Small Device Variations Be Disastrous? (07/15/2022)
- Negative Feedback Training: A Novel Concept to Improve Robustness of NVCiM DNN Accelerators (05/23/2023)
- On the Reliability of Computing-in-Memory Accelerators for Deep Neural Networks (05/25/2022)
- MemSE: Fast MSE Prediction for Noisy Memristor-Based DNN Accelerators (05/03/2022)
- On the Viability of using LLMs for SW/HW Co-Design: An Example in Designing CiM DNN Accelerators (06/12/2023)
- Stochastic Deep Learning in Memristive Networks (11/09/2017)
- Security Analysis of Capsule Network Inference using Horizontal Collaboration (09/22/2021)
