Simple and Scalable Epistemic Uncertainty Estimation Using a Single Deep Deterministic Neural Network

03/04/2020
by   Joost van Amersfoort, et al.

We propose a method for training a deterministic deep model that can find and reject out-of-distribution data points at test time with a single forward pass. Our approach, deterministic uncertainty quantification (DUQ), builds on ideas from RBF networks, whose training we scale with a novel loss function and centroid updating scheme. By enforcing detectability of changes in the input using a gradient penalty, we are able to reliably detect out-of-distribution data. Our uncertainty quantification scales well to large datasets, and using a single model we improve upon or match Deep Ensembles on notably difficult dataset pairs such as FashionMNIST vs. MNIST and CIFAR-10 vs. SVHN, while maintaining competitive accuracy.
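The abstract describes an RBF-style output layer: class scores are kernel distances between a learned embedding of the features and per-class centroids, and the maximum score serves as the (inverse) uncertainty. A minimal sketch of that idea is below; the shapes, the fixed random weights, and the names (`W`, `centroids`, `sigma`) are illustrative stand-ins, not the paper's actual architecture or training procedure.

```python
import numpy as np

# DUQ-style RBF output layer (sketch): for each class c, compute
#   K_c(x) = exp(-||W_c f(x) - e_c||^2 / (2 * sigma^2))
# where f(x) are deep features, W_c a per-class embedding matrix,
# and e_c the centroid of class c. All values here are random stand-ins.

rng = np.random.default_rng(0)
n_classes, feat_dim, embed_dim = 3, 4, 5
sigma = 1.0  # RBF length scale (hyperparameter)

W = rng.normal(size=(n_classes, embed_dim, feat_dim))  # per-class embeddings W_c
centroids = rng.normal(size=(n_classes, embed_dim))    # class centroids e_c

def rbf_scores(f):
    """Kernel score per class for feature vector f."""
    emb = W @ f                                  # (n_classes, embed_dim)
    d2 = ((emb - centroids) ** 2).sum(axis=1)    # squared distance to each e_c
    return np.exp(-d2 / (2 * sigma ** 2))

f = rng.normal(size=feat_dim)       # stand-in for deep features f(x)
scores = rbf_scores(f)
pred = int(scores.argmax())         # predicted class: nearest centroid
confidence = float(scores.max())    # low max score flags out-of-distribution input
```

In the paper the centroids are not fixed but tracked with an exponential moving average of the embeddings of correctly assigned examples, and a gradient penalty on the input keeps the feature map sensitive to input changes; neither is shown in this sketch.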


Related research

08/21/2023  Deep Evidential Learning for Bayesian Quantile Regression
  It is desirable to have accurate uncertainty estimation from a single de...

02/07/2022  NUQ: Nonparametric Uncertainty Quantification for Deterministic Neural Networks
  This paper proposes a fast and scalable method for uncertainty quantific...

02/22/2021  Improving Deterministic Uncertainty Estimation in Deep Learning for Classification and Regression
  We propose a new model that estimates uncertainty in a single forward pa...

03/06/2022  Scalable Uncertainty Quantification for Deep Operator Networks using Randomized Priors
  We present a simple and effective approach for posterior uncertainty qua...

11/18/2021  Exploring the Limits of Epistemic Uncertainty Quantification in Low-Shot Settings
  Uncertainty quantification in neural networks promises to increase safety...

02/23/2021  Deterministic Neural Networks with Appropriate Inductive Biases Capture Epistemic and Aleatoric Uncertainty
  We show that a single softmax neural net with minimal changes can beat t...

05/02/2023  Single-model uncertainty quantification in neural network potentials does not consistently outperform model ensembles
  Neural networks (NNs) often assign high confidence to their predictions,...
