Calibration and Uncertainty Quantification of Bayesian Convolutional Neural Networks for Geophysical Applications

05/25/2021
by   Lukas Mosser, et al.

Deep neural networks offer numerous potential applications across the geosciences; for example, they are arguably the state-of-the-art method for predicting faults in seismic datasets. In quantitative reservoir characterization workflows, it is common to incorporate the uncertainty of predictions; such subsurface models should therefore provide calibrated probabilities and the associated uncertainties in their predictions. It has been shown that popular deep-learning-based models are often miscalibrated and, due to their deterministic nature, provide no means to interpret the uncertainty of their predictions. We compare three approaches to obtaining probabilistic models based on convolutional neural networks in a Bayesian formalism: Deep Ensembles, Concrete Dropout, and Stochastic Weight Averaging-Gaussian (SWAG). These methods are consistently applied to fault detection case studies: Deep Ensembles use independently trained models to provide fault probabilities, Concrete Dropout extends the popular Dropout technique to approximate Bayesian neural networks, and SWAG is a recent method based on the equivalence between mini-batch Stochastic Gradient Descent and Bayesian inference. We provide quantitative results in terms of model calibration and uncertainty representation, as well as qualitative results on synthetic and real seismic datasets. Our results show that the approximate Bayesian methods, Concrete Dropout and SWAG, both provide well-calibrated predictions and uncertainty attributes at a lower computational cost than the baseline Deep Ensemble approach. The resulting uncertainties also offer a means to further improve model performance and to enhance the interpretability of the models.
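The comparison described above can be illustrated with a minimal sketch of the two quantities involved: aggregating T stochastic forward passes (e.g. Monte Carlo samples from Concrete Dropout, SWAG weight draws, or ensemble members) into a mean fault probability with a predictive-entropy uncertainty attribute, and scoring calibration with the expected calibration error (ECE). The function names and binning choices here are illustrative assumptions, not the paper's actual code.

```python
import numpy as np

def predictive_stats(prob_samples):
    """prob_samples: array of shape (T, ...) with fault probabilities from
    T stochastic forward passes. Returns the mean probability and the
    binary predictive entropy, usable as an uncertainty attribute."""
    p = prob_samples.mean(axis=0)
    eps = 1e-12  # guard against log(0)
    entropy = -(p * np.log(p + eps) + (1.0 - p) * np.log(1.0 - p + eps))
    return p, entropy

def expected_calibration_error(probs, labels, n_bins=10):
    """Binned ECE for binary predictions: the average gap between
    accuracy and confidence, weighted by the fraction of samples
    falling in each confidence bin."""
    probs, labels = probs.ravel(), labels.ravel()
    conf = np.maximum(probs, 1.0 - probs)        # confidence of predicted class
    pred = (probs >= 0.5).astype(int)
    correct = (pred == labels).astype(float)
    bins = np.linspace(0.5, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(bins[:-1], bins[1:]):
        # include conf == 1.0 in the last bin
        mask = (conf >= lo) & ((conf < hi) if hi < 1.0 else (conf <= hi))
        if mask.any():
            ece += mask.mean() * abs(correct[mask].mean() - conf[mask].mean())
    return ece
```

A maximally uncertain voxel (mean probability 0.5) attains the entropy maximum ln 2, while a perfectly calibrated, fully confident predictor scores an ECE of zero; miscalibration shows up as a positive gap.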


