Post-hoc Calibration of Neural Networks

06/23/2020
by Amir Rahimi et al.

Calibration of neural networks is a critical consideration when incorporating machine learning models into real-world decision-making systems, where the confidence of a decision is as important as the decision itself. In recent years there has been a surge of research on neural network calibration, and the majority of the work falls under post-hoc calibration methods, defined as methods that learn an additional function to calibrate an already trained base network. In this work, we aim to understand post-hoc calibration methods from a theoretical point of view. In particular, it is known that minimizing the Negative Log-Likelihood (NLL) yields a calibrated network on the training set if the global optimum is attained (Bishop, 1994). Nevertheless, it is not clear whether learning an additional function in a post-hoc manner leads to calibration in this theoretical sense. To this end, we prove that even if the base network (f) does not attain the global optimum of the NLL, adding extra layers (g) and minimizing the NLL over the parameters of g alone yields a calibrated network g ∘ f. This not only provides a less stringent condition for obtaining a calibrated network but also gives a theoretical justification of post-hoc calibration methods. Our experiments on various image classification benchmarks confirm the theory.
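
To make the setting concrete, below is a minimal sketch (assuming PyTorch) of the generic post-hoc recipe the abstract describes: the base network f is kept frozen and only an added module g is fitted by minimizing NLL on held-out data. Temperature scaling is used here purely as the simplest possible choice of g; it is not the paper's specific construction, and the helper names (`TemperatureScaling`, `calibrate`, `base_model`, `val_loader`) are illustrative assumptions rather than anything from the paper.

```python
# Hedged sketch of generic post-hoc calibration: freeze f, fit g by minimizing NLL.
import torch
import torch.nn as nn

class TemperatureScaling(nn.Module):
    """One simple instance of the extra function g: divide logits by a learned temperature."""
    def __init__(self):
        super().__init__()
        self.log_temperature = nn.Parameter(torch.zeros(1))  # T = exp(0) = 1 at start

    def forward(self, logits):
        return logits / self.log_temperature.exp()

def calibrate(base_model, val_loader):
    """Fit g by minimizing the NLL of (g ∘ f) while keeping f fixed."""
    base_model.eval()                              # the base network f stays frozen
    g = TemperatureScaling()
    optimizer = torch.optim.LBFGS(g.parameters(), lr=0.01, max_iter=50)
    nll = nn.CrossEntropyLoss()                    # NLL for integer class labels

    # Cache the frozen logits f(x) once, so only g's parameters are optimized.
    with torch.no_grad():
        logits, labels = [], []
        for x, y in val_loader:
            logits.append(base_model(x))
            labels.append(y)
        logits = torch.cat(logits)
        labels = torch.cat(labels)

    def closure():
        optimizer.zero_grad()
        loss = nll(g(logits), labels)              # NLL of the composed network g ∘ f
        loss.backward()
        return loss

    optimizer.step(closure)
    return g                                       # calibrated probabilities: softmax(g(f(x)))
```

In this sketch the held-out set plays the role of the data on which g is optimized; the theoretical claim in the paper concerns what such NLL minimization over g alone guarantees about the calibration of g ∘ f.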

Related research

- 12/20/2020 · Post-hoc Uncertainty Calibration for Domain Drift Scenarios
- 12/20/2021 · Classifier Calibration: How to assess and improve predicted class probabilities: a survey
- 05/10/2021 · Meta-Cal: Well-controlled Post-hoc Calibration by Ranking
- 02/10/2022 · Heterogeneous Calibration: A post-hoc model-agnostic framework for improved generalization
- 07/13/2022 · Estimating Classification Confidence Using Kernel Densities
- 06/05/2023 · A Large-Scale Study of Probabilistic Calibration in Neural Network Regression
- 06/19/2023 · Scaling of Class-wise Training Losses for Post-hoc Calibration
