Improved Predictive Uncertainty using Corruption-based Calibration

06/07/2021
by Tiago Salvador et al.

We propose a simple post hoc calibration method to estimate the confidence/uncertainty that a model prediction is correct on data with covariate shift, as represented by the large-scale corrupted data benchmark [Ovadia et al., 2019]. We achieve this by synthesizing surrogate calibration sets, obtained by corrupting the original calibration set at varying intensities of a known corruption. Our method demonstrates significant improvements on the benchmark across a wide range of covariate shifts.
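The abstract does not spell out the calibration map itself, so the sketch below is only illustrative: it assumes temperature scaling as the post hoc calibrator and additive Gaussian noise as the known corruption. The names `model`, `corrupt`, `fit_temperature`, and `corruption_based_calibration`, along with the severity grid, are hypothetical stand-ins, not the paper's actual interface.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def corrupt(x, severity, seed=0):
    """Apply a known corruption; here additive Gaussian noise whose
    scale grows with severity (an illustrative assumption)."""
    rng = np.random.default_rng(seed)  # fixed seed for reproducibility
    return x + rng.normal(scale=0.1 * severity, size=x.shape)

def nll(temperature, logits, labels):
    """Negative log-likelihood of temperature-scaled softmax."""
    z = logits / temperature
    z = z - z.max(axis=1, keepdims=True)  # numerical stability
    log_probs = z - np.log(np.exp(z).sum(axis=1, keepdims=True))
    return -log_probs[np.arange(len(labels)), labels].mean()

def fit_temperature(logits, labels):
    """Standard post hoc temperature scaling on one calibration set."""
    res = minimize_scalar(nll, bounds=(0.05, 20.0),
                          args=(logits, labels), method="bounded")
    return res.x

def corruption_based_calibration(model, x_cal, y_cal,
                                 severities=(0, 1, 2, 3, 4, 5)):
    """Fit one calibrator per surrogate calibration set, where each
    surrogate is the original calibration set corrupted at a given
    intensity of the known corruption. `model` is assumed to be a
    callable mapping inputs to class logits."""
    temps = {}
    for s in severities:
        logits = model(corrupt(x_cal, s))
        temps[s] = fit_temperature(logits, y_cal)
    return temps
```

How the calibrator for unseen shifted data is chosen among the fitted severities is not specified in the abstract; selecting or interpolating based on an estimate of the shift intensity is one plausible option.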


Related research

12/20/2020
Post-hoc Uncertainty Calibration for Domain Drift Scenarios
We address the problem of uncertainty calibration. While standard deep n...

10/28/2021
Exploring Covariate and Concept Shift for Detection and Calibration of Out-of-Distribution Data
Moving beyond testing on in-distribution data works on Out-of-Distributi...

05/19/2022
Calibration Matters: Tackling Maximization Bias in Large-scale Advertising Recommendation Systems
Calibration is defined as the ratio of the average predicted click rate ...

06/19/2020
Evaluating Prediction-Time Batch Normalization for Robustness under Covariate Shift
Covariate shift has been shown to sharply degrade both predictive accura...

05/28/2019
Evaluating and Calibrating Uncertainty Prediction in Regression Tasks
Predicting not only the target but also an accurate measure of uncertain...

03/08/2023
HappyMap: A Generalized Multi-calibration Method
Multi-calibration is a powerful and evolving concept originating in the ...

07/13/2022
Estimating Classification Confidence Using Kernel Densities
This paper investigates the post-hoc calibration of confidence for "expl...
