Heteroscedastic Calibration of Uncertainty Estimators in Deep Learning

10/30/2019
by Bindya Venkatesh et al.

The role of uncertainty quantification (UQ) in deep learning has become crucial with the growing use of predictive models in high-risk applications. Though a large class of methods exists for measuring deep uncertainties, in practice the resulting estimates are often poorly calibrated, making it challenging to translate them into actionable insights. A common workaround is to apply a separate recalibration step that adjusts the estimates to compensate for the miscalibration. Instead, we propose to repurpose the heteroscedastic regression objective as a surrogate for calibration, enabling any existing uncertainty estimator to be inherently calibrated. In addition to eliminating the need for recalibration, this also regularizes the training process. Using regression experiments, we demonstrate the effectiveness of the proposed heteroscedastic calibration with two popular uncertainty estimators.
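The abstract does not give the training objective explicitly, but the standard heteroscedastic regression loss it refers to is the Gaussian negative log-likelihood with a per-sample predicted variance. The sketch below (function name and parameterization are our own, not from the paper) shows this loss in NumPy, with the variance predicted in log space for numerical stability:

```python
import numpy as np

def heteroscedastic_nll(y_true, mu, log_var):
    """Gaussian negative log-likelihood with per-sample variance.

    Minimizing this jointly trains a mean head `mu` and a log-variance
    head `log_var`: the variance term is rewarded for matching the
    observed squared error, which is the sense in which the objective
    acts as a surrogate for calibration.
    """
    precision = np.exp(-log_var)  # 1 / sigma^2
    # 0.5 * [ (y - mu)^2 / sigma^2 + log sigma^2 ], averaged over samples
    return np.mean(0.5 * (precision * (y_true - mu) ** 2 + log_var))
```

Note the trade-off the loss encodes: inflating `log_var` damps the squared-error term but is penalized directly by the `log_var` term, so the optimum is reached when the predicted variance tracks the actual error magnitude.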


