Single-shot Bayesian approximation for neural networks

08/24/2023
by   Kai Brach, et al.

Deep neural networks (NNs) are known for their high prediction performance. However, NNs are prone to yielding unreliable predictions when they encounter completely new situations, and they do so without indicating their uncertainty. Bayesian variants of NNs (BNNs), such as Monte Carlo (MC) dropout BNNs, provide uncertainty measures and can simultaneously increase prediction performance. The main disadvantage of BNNs is their higher computation time at test time, because they rely on sampling. Here we present a single-shot MC dropout approximation that preserves the advantages of BNNs while being as fast as plain NNs. Our approach is based on moment propagation (MP) and analytically approximates the expected value and the variance of the MC dropout signal for the layers commonly used in NNs: convolution, max pooling, dense, softmax, and dropout. The MP approach can convert an NN into a BNN without re-training, provided the NN was trained with standard dropout. We evaluate our approach on several benchmark datasets and a simulated toy example, in both classification and regression settings. We demonstrate that our single-shot MC dropout approximation matches both the point estimate and the uncertainty estimate of the predictive distribution obtained with an MC approach, while being fast enough for real-time deployment of BNNs. We also show that using part of the saved time to combine our MP approach with deep ensemble techniques further improves the uncertainty measures.
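To make the moment-propagation idea concrete, here is a minimal NumPy sketch (not the authors' implementation; the function names and the toy 3-to-2 layer are hypothetical). For inverted dropout with drop rate r, the mask b ~ Bernoulli(1-r)/(1-r) has E[b] = 1 and E[b²] = 1/(1-r), which gives closed-form mean and variance updates; a linear layer then maps the mean through W and the variance through W², assuming independent inputs.

```python
import numpy as np

def dropout_moments(mean, var, rate):
    """Propagate mean/variance through an inverted MC dropout layer.
    Mask b ~ Bernoulli(1-rate)/(1-rate): E[b] = 1, E[b^2] = 1/(1-rate),
    so Var[b*x] = E[b^2] * E[x^2] - E[x]^2."""
    keep = 1.0 - rate
    new_var = (var + mean**2) / keep - mean**2
    return mean, new_var

def dense_moments(mean, var, W, bias):
    """Linear layer: the mean maps through W, the variance through W**2
    (elements assumed independent, so covariances are dropped)."""
    return W @ mean + bias, (W**2) @ var

# Toy check against MC dropout sampling for a hypothetical 3 -> 2 dense layer.
rng = np.random.default_rng(0)
x = rng.normal(size=3)            # a fixed input: mean = x, variance = 0
W = rng.normal(size=(2, 3))
bias = np.zeros(2)

m, v = dropout_moments(x, np.zeros(3), rate=0.2)
m, v = dense_moments(m, v, W, bias)

# MC reference: average over sampled dropout masks (what MP replaces).
masks = rng.binomial(1, 0.8, size=(100_000, 3)) / 0.8
samples = (x * masks) @ W.T + bias
mc_mean, mc_var = samples.mean(axis=0), samples.var(axis=0)
```

The single analytic pass (`m`, `v`) closely matches the sampled `mc_mean` and `mc_var`, but needs one forward pass instead of thousands, which is the speed-up the abstract refers to.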

