Evaluating Scalable Uncertainty Estimation Methods for DNN-Based Molecular Property Prediction

10/07/2019
by Gabriele Scalia, et al.

Advances in deep neural network (DNN) based molecular property prediction have recently led to models of remarkable accuracy and generalization ability, with graph convolutional neural networks (GCNNs) reporting state-of-the-art performance for this task. However, some challenges remain, and one of the most important is uncertainty quantification. DNN performance is affected by the volume and the quality of the training samples. Therefore, establishing when and to what extent a prediction can be considered reliable is just as important as producing accurate predictions, especially when out-of-domain molecules are targeted. Recently, several methods to account for uncertainty in DNNs have been proposed, most of which are based on approximate Bayesian inference. Among these, only a few scale to the large datasets required in applications. Evaluating and comparing these methods has recently attracted great interest, but results are generally fragmented and absent for molecular property prediction. In this paper, we aim to quantitatively compare scalable techniques for uncertainty estimation in GCNNs. We introduce a set of quantitative criteria to capture different uncertainty aspects, and then use these criteria to compare MC-Dropout, deep ensembles, and bootstrapping, both theoretically, in a unified framework that separates aleatoric and epistemic uncertainty, and experimentally, on the QM9 dataset. Our experiments quantify the performance of the different uncertainty estimation methods and their impact on uncertainty-related error reduction. Our findings indicate that ensembling and bootstrapping consistently outperform MC-Dropout, with different context-specific pros and cons. Our analysis also leads to a better understanding of the role of aleatoric and epistemic uncertainty and highlights the challenge posed by out-of-domain uncertainty.
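To make the compared techniques concrete, below is a minimal, hedged sketch (not the authors' code) of how MC-Dropout and a deep ensemble can both be reduced to the same mean/variance decomposition the abstract refers to: each sampled network predicts a mean and a heteroscedastic variance, the average predicted variance is taken as the aleatoric part, and the disagreement between sampled means as the epistemic part. A small feed-forward regressor in PyTorch stands in for a GCNN purely for brevity; `MeanVarianceNet`, the layer sizes, and the toy data are illustrative assumptions.

```python
# Sketch only: aleatoric/epistemic decomposition for MC-Dropout and deep ensembles.
import torch
import torch.nn as nn


class MeanVarianceNet(nn.Module):
    """Predicts a mean and a log-variance (heteroscedastic noise) per input."""

    def __init__(self, in_dim: int, hidden: int = 64, p_drop: float = 0.1):
        super().__init__()
        self.body = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(), nn.Dropout(p_drop),
            nn.Linear(hidden, hidden), nn.ReLU(), nn.Dropout(p_drop),
        )
        self.mean_head = nn.Linear(hidden, 1)
        self.logvar_head = nn.Linear(hidden, 1)

    def forward(self, x):
        h = self.body(x)
        return self.mean_head(h), self.logvar_head(h)


def gaussian_nll(mean, logvar, target):
    """Negative log-likelihood used to train the aleatoric variance head."""
    return 0.5 * (logvar + (target - mean) ** 2 / logvar.exp()).mean()


def decompose(means, variances):
    """means, variances: [n_samples, batch, 1] stacked over stochastic passes
    or ensemble members. Returns per-input aleatoric and epistemic variance."""
    aleatoric = variances.mean(dim=0)              # average predicted noise
    epistemic = means.var(dim=0, unbiased=False)   # disagreement between samples
    return aleatoric, epistemic


@torch.no_grad()
def mc_dropout_predict(model, x, n_samples: int = 20):
    model.train()  # keep dropout active at prediction time (MC-Dropout)
    means, logvars = zip(*(model(x) for _ in range(n_samples)))
    return decompose(torch.stack(means), torch.stack(logvars).exp())


@torch.no_grad()
def ensemble_predict(models, x):
    for m in models:
        m.eval()  # each member predicts deterministically
    means, logvars = zip(*(m(x) for m in models))
    return decompose(torch.stack(means), torch.stack(logvars).exp())


if __name__ == "__main__":
    x = torch.randn(8, 16)  # toy batch of 8 molecules with 16 descriptors each
    single_model = MeanVarianceNet(16)
    ensemble = [MeanVarianceNet(16) for _ in range(5)]
    ale, epi = mc_dropout_predict(single_model, x)
    print("MC-Dropout aleatoric/epistemic:", ale.squeeze(), epi.squeeze())
    ale, epi = ensemble_predict(ensemble, x)
    print("Ensemble   aleatoric/epistemic:", ale.squeeze(), epi.squeeze())
```

Bootstrapping fits the same template: it differs from the plain deep ensemble only in that each member would be trained on a bootstrap resample of the training set before `ensemble_predict` is called.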
