MC-CIM: Compute-in-Memory with Monte-Carlo Dropouts for Bayesian Edge Intelligence

11/13/2021
by Priyesh Shukla, et al.

We propose MC-CIM, a compute-in-memory (CIM) framework for robust yet low-power Bayesian edge intelligence. Deep neural networks (DNNs) with deterministic weights cannot express their prediction uncertainty, and therefore pose critical risks in applications where mispredictions can be fatal, such as surgical robotics. To address this limitation, Bayesian inference over DNNs has gained attention: it extracts not only the prediction itself but also the prediction confidence, enabling risk-aware planning. However, full Bayesian inference of a DNN is computationally expensive and ill-suited for real-time or edge deployment. Monte Carlo Dropout (MC-Dropout), an approximation to Bayesian DNN inference, offers high robustness at low computational complexity. To further improve the method's efficiency, we present a novel CIM module that performs in-memory probabilistic dropout in addition to in-memory weight-input scalar products. We also propose a compute-reuse reformulation of MC-Dropout in which each successive instance reuses the product-sum computations of the previous iteration, and we show how the random instances can be optimally ordered, using combinatorial optimization methods, to minimize the overall MC-Dropout workload. We demonstrate the proposed CIM-based MC-Dropout execution on MNIST character recognition and on visual odometry (VO) for autonomous drones; the framework delivers reliable prediction confidence despite the non-idealities introduced by MC-CIM. In its most optimal computing and peripheral configuration, the proposed MC-CIM with a 16x31 SRAM array, 0.85 V supply, and 16 nm low-standby-power (LSTP) technology consumes 27.8 pJ for 30 MC-Dropout instances of probabilistic inference, saving 43...
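The two algorithmic ideas in the abstract can be sketched in a few lines of NumPy: MC-Dropout draws T random dropout masks, runs T stochastic forward passes, and reports the mean as the prediction and the spread as the confidence; the compute-reuse idea computes the weight-input products once and lets each instance merely re-mask them; and the instance-ordering idea arranges masks so successive instances differ as little as possible. The greedy Hamming-distance ordering below is a simple stand-in for the paper's combinatorial optimization, and the toy single-layer network, keep probability, and array sizes are illustrative assumptions, not the paper's actual configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy single-layer network y = relu(W x), with dropout on the 31 hidden
# units; the 16x31 shape merely echoes the SRAM array size in the abstract.
W = rng.standard_normal((31, 16))
x = rng.standard_normal(16)

T = 30          # number of MC-Dropout instances, as in the abstract
p_keep = 0.8    # keep probability (an assumption; the abstract does not fix it)

# Draw T random dropout masks over the hidden units.
masks = rng.random((T, 31)) < p_keep

# Greedy nearest-neighbour ordering by Hamming distance, so that successive
# instances flip as few units as possible (a stand-in for the paper's
# combinatorial ordering of random instances).
order = [0]
remaining = set(range(1, T))
while remaining:
    last = masks[order[-1]]
    nxt = min(remaining, key=lambda i: np.count_nonzero(masks[i] ^ last))
    order.append(nxt)
    remaining.remove(nxt)

# Compute-reuse: the weight-input products h = W @ x are computed once;
# each instance only re-masks them instead of recomputing the products.
h = W @ x
preds = np.array([np.maximum(masks[i] * h, 0.0).sum() for i in order])

# Prediction and its confidence, from the spread of the T instances.
mean, std = preds.mean(), preds.std()
print(f"prediction {mean:.3f} +/- {std:.3f}")
```

In hardware terms, a small Hamming distance between consecutive masks means few product-sum terms change between instances, which is what makes the ordering worth optimizing.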


Related research

07/03/2020: Qualitative Analysis of Monte Carlo Dropout
In this report, we present qualitative analysis of Monte Carlo (MC) drop...

03/03/2023: Lightweight, Uncertainty-Aware Conformalized Visual Odometry
Data-driven visual odometry (VO) is a critical subroutine for autonomous...

06/09/2021: Ex uno plures: Splitting One Model into an Ensemble of Subnetworks
Monte Carlo (MC) dropout is a simple and efficient ensembling method tha...

07/18/2021: Compressed Monte Carlo with application in particle filtering
Bayesian models have become very popular over the last years in several ...

10/08/2021: Is MC Dropout Bayesian?
MC Dropout is a mainstream "free lunch" method in medical imaging for ap...

03/15/2022: Self-Normalized Density Map (SNDM) for Counting Microbiological Objects
The statistical properties of the density map (DM) approach to counting ...

08/28/2018: Using Monte Carlo dropout for non-stationary noise reduction from speech
In this work, we propose the use of dropout as a Bayesian estimator for ...
