Single-model uncertainty quantification in neural network potentials does not consistently outperform model ensembles

05/02/2023
by Aik Rui Tan, et al.

Neural networks (NNs) often assign high confidence to their predictions, even for points far out of distribution, making uncertainty quantification (UQ) a challenge. When NNs are employed to model interatomic potentials in materials systems, this problem leads to unphysical structures that disrupt simulations, or to biased statistics and dynamics that do not reflect the true physics. Differentiable UQ techniques can find new informative data and drive active-learning loops to build robust potentials. However, a variety of UQ techniques, including newly developed ones, exist for atomistic simulations, and there are no clear guidelines on which are most effective or suitable for a given case. In this work, we examine multiple UQ schemes for improving the robustness of NN interatomic potentials (NNIPs) through active learning. In particular, we compare incumbent ensemble-based methods against strategies that use single, deterministic NNs: mean-variance estimation (MVE), deep evidential regression, and Gaussian mixture models (GMMs). We explore three datasets ranging from in-domain interpolative learning to more extrapolative out-of-domain generalization challenges: rMD17, ammonia inversion, and bulk silica glass. Performance is measured across multiple metrics relating model error to uncertainty. Our experiments show that no single method consistently outperformed the others across all metrics. Ensembling remained better at generalization and NNIP robustness; MVE proved effective only for in-domain interpolation, while GMMs performed better out of domain; and evidential regression, despite its promise, was not a preferable alternative in any of the cases. More broadly, cost-effective single deterministic models cannot yet consistently match or outperform ensembling for uncertainty quantification in NNIPs.
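To make the two main strategies concrete, here is a minimal toy sketch (not the paper's actual NNIP architectures or data) of how they produce uncertainties. The "ensemble" members are bootstrap polynomial fits standing in for independently trained NNs; the spread of their predictions is the epistemic uncertainty. The `gaussian_nll` function is the per-point loss minimized in mean-variance estimation, where a single network outputs both a mean and a variance. All names and the 1-D regression setup are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression data: y = sin(x) + noise, training domain [-3, 3]
x_train = rng.uniform(-3, 3, size=50)
y_train = np.sin(x_train) + 0.1 * rng.normal(size=50)

# Ensemble of M polynomial fits, each on a bootstrap resample
# (a stand-in for M independently initialized/trained NNIPs)
M, degree = 5, 7
coeffs = []
for _ in range(M):
    idx = rng.integers(0, len(x_train), len(x_train))
    coeffs.append(np.polyfit(x_train[idx], y_train[idx], degree))

x_test = np.linspace(-4, 4, 9)   # endpoints lie outside the training domain
preds = np.stack([np.polyval(c, x_test) for c in coeffs])

mean = preds.mean(axis=0)        # ensemble prediction
sigma = preds.std(axis=0)        # ensemble (epistemic) uncertainty


def gaussian_nll(y, mu, var):
    """Per-point Gaussian negative log-likelihood used to train
    mean-variance estimation: the model predicts mu(x) and var(x)."""
    return 0.5 * (np.log(var) + (y - mu) ** 2 / var)


# Example: score the ensemble mean/variance under this loss
nll = gaussian_nll(np.sin(x_test), mean, sigma**2 + 1e-6)
```

A useful sanity check on any such UQ scheme, and one intuition behind the paper's metrics, is that the reported uncertainty should be larger at the extrapolative test points (here x = ±4) than deep inside the training domain.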


