A Consistent and Differentiable Lp Canonical Calibration Error Estimator

10/13/2022
by Teodora Popordanoska, et al.

Calibrated probabilistic classifiers are models whose predicted probabilities can directly be interpreted as uncertainty estimates. It has been shown recently that deep neural networks are poorly calibrated and tend to output overconfident predictions. As a remedy, we propose a low-bias, trainable calibration error estimator based on Dirichlet kernel density estimates, which asymptotically converges to the true L_p calibration error. This novel estimator enables us to tackle the strongest notion of multiclass calibration, called canonical (or distribution) calibration, while other common calibration methods are tractable only for top-label and marginal calibration. The computational complexity of our estimator is 𝒪(n^2), the convergence rate is 𝒪(n^(-1/2)), and it is unbiased up to 𝒪(n^(-2)), achieved by a geometric series debiasing scheme. In practice, this means that the estimator can be applied to small subsets of data, enabling efficient estimation and mini-batch updates. The proposed method has a natural choice of kernel, and can be used to generate consistent estimates of other quantities based on conditional expectation, such as the sharpness of a probabilistic classifier. Empirical results validate the correctness of our estimator, and demonstrate its utility in canonical calibration error estimation and calibration error regularized risk minimization.
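To make the construction concrete, below is a minimal sketch of the plug-in, leave-one-out form of such an estimator: a Dirichlet kernel regression of the labels on the predicted probability vectors, followed by the mean p-th power of the L_p gap between the regression estimate and the prediction. The function names, the fixed bandwidth, and the use of plain NumPy are illustrative assumptions, not the authors' reference implementation, and the 𝒪(n^(-2)) geometric-series debiasing step from the abstract is omitted for brevity.

```python
# Sketch: Dirichlet-kernel estimate of the L_p calibration error (p-th power).
# Assumptions: `probs` are (n, d) predictions with rows on the simplex,
# `labels_onehot` are (n, d) one-hot labels; bandwidth choice is illustrative.
import numpy as np
from scipy.special import gammaln

def dirichlet_log_density(u, alpha):
    """Log Dirichlet density of each row of u under parameters alpha."""
    return (gammaln(alpha.sum()) - gammaln(alpha).sum()
            + ((alpha - 1.0) * np.log(u)).sum(axis=1))

def lp_calibration_error_pth_power(probs, labels_onehot, bandwidth=0.1, p=2):
    """Plug-in, leave-one-out estimate of E[||E[y | f(x)] - f(x)||_p^p]."""
    n, _ = probs.shape
    eps = 1e-12
    u = np.clip(probs, eps, 1.0)
    # Pairwise kernel matrix: K[i, j] = Dirichlet(f_j; f_i / bandwidth + 1),
    # i.e. a Dirichlet kernel centered at prediction f_i, evaluated at f_j.
    K = np.empty((n, n))
    for i in range(n):
        K[i] = np.exp(dirichlet_log_density(u, probs[i] / bandwidth + 1.0))
    np.fill_diagonal(K, 0.0)  # leave-one-out: drop each point's self-term
    # Nadaraya-Watson estimate of E[y | f(x) = f_j] at every prediction f_j
    weights = K / (K.sum(axis=0, keepdims=True) + eps)
    cond_mean = weights.T @ labels_onehot
    # Average the p-th power of the per-sample L_p gap
    return (np.abs(cond_mean - probs) ** p).sum(axis=1).mean()
```

Because the kernel matrix costs 𝒪(n^2) time and memory, the quantity is naturally computed on mini-batches; a differentiable variant of the same computation (e.g., written with PyTorch tensor ops) could then serve as the penalty term in calibration error regularized risk minimization.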

