Uncertainty Estimation with Infinitesimal Jackknife, Its Distribution and Mean-Field Approximation

06/13/2020
by Zhiyun Lu, et al.

Uncertainty quantification is an important research area in machine learning. Many approaches have been developed to improve the representation of uncertainty in deep models and avoid overconfident predictions. Existing approaches such as Bayesian neural networks and ensemble methods require modifications to the training procedure and are computationally costly at both training and inference time. Motivated by this, we propose the mean-field infinitesimal jackknife (mfIJ), a simple, efficient, and general-purpose plug-in estimator for uncertainty estimation. The main idea is to use the infinitesimal jackknife, a classical statistical tool for uncertainty estimation, to construct a pseudo-ensemble that can be described by a closed-form Gaussian distribution, without retraining. We then use this Gaussian distribution for uncertainty estimation. Whereas the standard approach is to sample models from this distribution and average their predictions, we develop a mean-field approximation to the inference step, in which Gaussian random variables must be integrated through the softmax nonlinearity to produce probabilities for the multinomial output. The approach has several appealing properties: it functions as an ensemble without requiring multiple models, and it enables closed-form approximate inference using only the first and second moments of the Gaussian. Empirically, mfIJ performs competitively against state-of-the-art methods, including deep ensembles, temperature scaling, dropout, and Bayesian NNs, on important uncertainty tasks. In particular, it outperforms many of these methods on out-of-distribution detection.
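The abstract only outlines the two ingredients, so the following is a rough, illustrative NumPy sketch rather than a reproduction of the paper's formulas. It assumes the infinitesimal-jackknife pseudo-ensemble is a Gaussian over parameters with a sandwich-style covariance built from per-example gradients and the Hessian, and it approximates the softmax integration with the common probit-style mean-field rule softmax(mu / sqrt(1 + pi * var / 8)). The function names (ij_gaussian, mean_field_softmax), the damping term, and the exact covariance form are assumptions introduced here for illustration.

```python
import numpy as np

def ij_gaussian(grads, hessian, damping=1e-3):
    """Illustrative infinitesimal-jackknife pseudo-ensemble (sketch, not the paper's exact form).

    Assumes a sandwich-style parameter covariance H^{-1} (sum_i g_i g_i^T) H^{-1}
    around the fitted parameters; `grads` is an (n, d) array of per-example
    gradients at the trained solution and `hessian` is the (d, d) Hessian.
    """
    d = hessian.shape[0]
    h_inv = np.linalg.inv(hessian + damping * np.eye(d))  # damped inverse Hessian
    score_cov = grads.T @ grads                            # sum of gradient outer products
    return h_inv @ score_cov @ h_inv                       # covariance; mean = fitted params

def mean_field_softmax(mu, var):
    """Approximate E[softmax(z)] for z ~ N(mu, diag(var)).

    Uses the probit-style mean-field approximation softmax(mu / sqrt(1 + pi*var/8)),
    a stand-in for the paper's closed-form rule using first and second moments.
    """
    scaled = mu / np.sqrt(1.0 + np.pi * var / 8.0)
    scaled -= scaled.max()              # numerical stability
    e = np.exp(scaled)
    return e / e.sum()

# Toy usage: 3-class logit means with per-class predictive variances obtained,
# hypothetically, by pushing the parameter covariance through a linearized network.
mu = np.array([2.0, 0.5, -1.0])
var = np.array([0.3, 1.5, 0.2])
print(mean_field_softmax(mu, var))      # softer than softmax(mu) where var is large
```

In this sketch, large predictive variance shrinks the corresponding logit before the softmax, which is how the mean-field rule produces less confident probabilities without drawing any parameter samples.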
