Class-wise and reduced calibration methods

10/07/2022
by Michael Panchenko, et al.

For many applications of probabilistic classifiers it is important that the predicted confidence vectors reflect true probabilities (one says that the classifier is calibrated). It has been shown that common models fail to satisfy this property, making reliable methods for measuring and improving calibration important tools. Unfortunately, obtaining these is far from trivial for problems with many classes. We propose two techniques that can be used in tandem. First, a reduced calibration method transforms the original problem into a simpler one. We prove, for several notions of calibration, that solving the reduced problem minimizes the corresponding notion of miscalibration in the full problem, which allows the use of non-parametric recalibration methods that fail in higher dimensions. Second, we propose class-wise calibration methods, based on intuition building on the phenomenon of neural collapse and on the observation that most accurate classifiers found in practice can be thought of as a union of K different functions, one per class, which can be recalibrated separately. These typically outperform their non-class-wise counterparts, especially for classifiers trained on imbalanced datasets. Applying the two methods together yields class-wise reduced calibration algorithms, which are powerful tools for reducing prediction and per-class calibration errors. We demonstrate our methods on real and synthetic datasets and release all code as open source at https://github.com/appliedAI-Initiative
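The abstract describes the two techniques only at a high level. As a rough illustration, the sketch below combines them under two assumptions: that the "reduction" maps the K-class problem to calibrating the top-label confidence, and that "class-wise" means fitting a separate one-dimensional non-parametric recalibration map for each predicted class. The class name ClasswiseReducedCalibrator and all implementation details are hypothetical, not the authors' method; see the linked repository for the actual code.

```python
import numpy as np
from sklearn.isotonic import IsotonicRegression

class ClasswiseReducedCalibrator:
    """Hypothetical sketch: per-class isotonic recalibration of the
    top-label confidence (a 1-D reduction of the K-class problem)."""

    def __init__(self, n_classes):
        self.n_classes = n_classes
        self.calibrators = {}

    def fit(self, probs, labels):
        """probs: (n, K) predicted probability vectors; labels: (n,) true classes."""
        preds = probs.argmax(axis=1)           # predicted class per sample
        confs = probs.max(axis=1)              # top-label confidence (the reduction)
        correct = (preds == labels).astype(float)
        for k in range(self.n_classes):
            mask = preds == k                  # class-wise: one map per predicted class
            if mask.sum() < 2:                 # too few samples to fit a map
                continue
            iso = IsotonicRegression(y_min=0.0, y_max=1.0, out_of_bounds="clip")
            iso.fit(confs[mask], correct[mask])  # non-parametric 1-D recalibration
            self.calibrators[k] = iso
        return self

    def predict_confidence(self, probs):
        """Return calibrated top-label confidences."""
        preds = probs.argmax(axis=1)
        confs = probs.max(axis=1)
        out = confs.copy()
        for k, iso in self.calibrators.items():
            mask = preds == k
            out[mask] = iso.predict(confs[mask])
        return out
```

On a held-out calibration split one would call fit(val_probs, val_labels) and then predict_confidence(test_probs). Fitting one map per predicted class is what lets such a scheme adapt to imbalanced data, at the cost of requiring enough calibration samples for every class.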
