Adaptive Temperature Scaling for Robust Calibration of Deep Neural Networks

07/31/2022
by Sergio A. Balanya, et al.

In this paper, we study the post-hoc calibration of modern neural networks, a problem that has drawn considerable attention in recent years. Many calibration methods of varying complexity have been proposed for the task, but there is no consensus about how expressive they should be. We focus on the task of confidence scaling, specifically on post-hoc methods that generalize Temperature Scaling; we call these the Adaptive Temperature Scaling family. We analyze expressive functions that improve calibration and propose interpretable methods. We show that when plenty of data is available, complex models such as neural networks yield better performance, but they are prone to fail when data is scarce, a common situation in certain post-hoc calibration applications such as medical diagnosis. We study the functions that expressive methods learn under ideal conditions and design simpler methods with a strong inductive bias towards these well-performing functions. Concretely, we propose Entropy-based Temperature Scaling, a simple method that scales the confidence of a prediction according to its entropy. Results show that our method obtains state-of-the-art performance compared with existing methods and, unlike complex models, is robust against data scarcity. Moreover, our proposed model enables a deeper interpretation of the calibration process.
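The abstract does not spell out the exact entropy-to-temperature map the paper learns, so the sketch below is only an illustration of the general idea behind the Adaptive Temperature Scaling family: each sample's logits are divided by a sample-dependent temperature before the softmax. The affine map from entropy to log-temperature, the function name entropy_temperature_scale, and the parameters a, b are assumptions for illustration, not the paper's method.

import numpy as np

def softmax(logits, axis=-1):
    # Numerically stable softmax over the class axis.
    z = logits - logits.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def entropy(probs, axis=-1, eps=1e-12):
    # Shannon entropy of each predictive distribution.
    return -(probs * np.log(probs + eps)).sum(axis=axis)

def entropy_temperature_scale(logits, a, b):
    """Rescale logits with a per-sample temperature T = exp(a*H + b),
    where H is the entropy of the uncalibrated prediction.
    a, b are scalars fitted on a held-out calibration set
    (hypothetical parametrization; a = 0 recovers plain Temperature Scaling)."""
    probs = softmax(logits)                 # uncalibrated probabilities
    H = entropy(probs)                      # shape: (batch,)
    T = np.exp(a * H + b)                   # per-sample temperature, always > 0
    return softmax(logits / T[:, None])     # calibrated probabilities

In practice one would fit (a, b) by minimizing the negative log-likelihood of validation labels under the rescaled probabilities, mirroring the standard post-hoc calibration setup in which the classifier's weights stay frozen.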


Related research

02/24/2021 · Parameterized Temperature Scaling for Boosting the Expressive Power in Post-Hoc Uncertainty Calibration
We address the problem of uncertainty calibration and introduce a novel ...

08/30/2019 · Bin-wise Temperature Scaling (BTS): Improvement in Confidence Calibration Performance through Simple Scaling Techniques
The prediction reliability of neural networks is important in many appli...

06/01/2023 · A Uniform Confidence Phenomenon in Deep Learning and its Implications for Calibration
Despite the impressive generalization capabilities of deep neural networ...

07/13/2022 · Sample-dependent Adaptive Temperature Scaling for Improved Calibration
It is now well known that neural networks can be wrong with high confide...

07/24/2023 · Rethinking Data Distillation: Do Not Overlook Calibration
Neural networks trained on distilled data often produce over-confident o...

06/19/2023 · Scaling of Class-wise Training Losses for Post-hoc Calibration
The class-wise training losses often diverge as a result of the various ...
