Dual Focal Loss for Calibration

05/23/2023
by Linwei Tao et al.

The use of deep neural networks in real-world applications requires well-calibrated networks, whose confidence scores accurately reflect the actual probability of correctness. However, these networks often produce over-confident predictions, which leads to poor calibration. Recent efforts have sought to address this issue by using focal loss to reduce over-confidence, but this approach can also lead to under-confident predictions. While different variants of focal loss have been explored, it remains difficult to strike a balance between over-confidence and under-confidence. In our work, we propose a new loss function that focuses on dual logits. Our method considers not only the ground truth logit but also the highest logit ranked after the ground truth logit. By maximizing the gap between these two logits, our proposed dual focal loss achieves a better balance between over-confidence and under-confidence. We provide theoretical evidence to support our approach and demonstrate its effectiveness through evaluations on multiple models and datasets, where it achieves state-of-the-art performance. Code is available at https://github.com/Linwei94/DualFocalLoss
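To make the idea concrete, here is a minimal numpy sketch of a dual-focal-style loss. It assumes the modulating factor takes the form (1 - p_t + p_j)^gamma, where p_t is the softmax probability of the ground-truth class and p_j is the largest probability among the remaining classes; the exact formulation used by the authors should be checked against the linked repository, and the function name and gamma default are illustrative choices.

```python
import numpy as np

def softmax(logits):
    """Numerically stable softmax over the last axis."""
    z = logits - logits.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def dual_focal_loss(logits, targets, gamma=2.0):
    """Sketch of a dual-focal-style loss (illustrative, not the official code).

    p_t: probability of the ground-truth class.
    p_j: highest probability among the non-ground-truth classes.
    The factor (1 - p_t + p_j)**gamma shrinks as the gap between the
    ground-truth logit and its strongest competitor grows, so confidently
    separated samples contribute less to the loss.
    """
    probs = softmax(logits)
    n = len(targets)
    p_t = probs[np.arange(n), targets]
    # Mask out the ground-truth class before taking the max.
    masked = probs.copy()
    masked[np.arange(n), targets] = -np.inf
    p_j = masked.max(axis=-1)
    return np.mean(-((1.0 - p_t + p_j) ** gamma) * np.log(p_t))
```

Note that with gamma = 0 the modulating factor becomes 1 and the loss reduces to plain cross-entropy, which matches the way standard focal loss degenerates.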


Related research

- 09/21/2021: Bayesian Confidence Calibration for Epistemic Uncertainty Modelling
- 09/28/2018: Confidence Calibration in Deep Neural Networks through Stochastic Inferences
- 06/03/2022: On Calibration of Graph Neural Networks for Node Classification
- 12/03/2019: Distance-Based Learning from Errors for Confidence Calibration
- 03/06/2023: Rethinking Confidence Calibration for Failure Prediction
- 09/06/2023: Multiclass Alignment of Confidence and Certainty for Network Calibration
- 03/21/2023: PRISE: Demystifying Deep Lucas-Kanade with Strongly Star-Convex Constraints for Multimodel Image Alignment
