A Large-Scale Study of Probabilistic Calibration in Neural Network Regression

06/05/2023
by Victor Dheur, et al.

Accurate probabilistic predictions are essential for optimal decision making. While neural network miscalibration has been studied primarily in classification, we investigate it in the less-explored regression setting. We conduct the largest empirical study to date assessing the probabilistic calibration of neural networks. We also analyze how well recalibration, conformal, and regularization methods improve probabilistic calibration. Additionally, we introduce novel differentiable recalibration and regularization methods, uncovering new insights into their effectiveness. Our findings reveal that regularization methods offer a favorable tradeoff between calibration and sharpness. Post-hoc methods exhibit superior probabilistic calibration, which we attribute to the finite-sample coverage guarantee of conformal prediction. Furthermore, we demonstrate that quantile recalibration can be viewed as a special case of conformal prediction. Our study is fully reproducible and implemented in a common code base for fair comparisons.
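To make the core notions concrete, here is a minimal sketch (not the paper's implementation) of how probabilistic calibration is typically measured in regression and how quantile recalibration works. The synthetic data, the overconfident Gaussian predictive model, and the helper names are assumptions for illustration: a model is probabilistically calibrated when its probability integral transform (PIT) values are uniform, and quantile recalibration remaps PIT values through their empirical CDF on a held-out split.

```python
import math
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: synthetic data where the predictive model has the
# correct mean but an understated standard deviation, so its predictive
# distributions are sharp yet probabilistically miscalibrated.
n = 2000
x = rng.uniform(-1.0, 1.0, n)
y = 2.0 * x + rng.normal(0.0, 1.0, n)

def normal_cdf(z):
    """Standard normal CDF, vectorized via math.erf."""
    return 0.5 * (1.0 + np.vectorize(math.erf)(z / math.sqrt(2.0)))

# Probability integral transform: PIT_i = F_i(y_i) for each predictive CDF F_i.
# Under perfect probabilistic calibration, PIT values are uniform on [0, 1].
pred_mean, pred_std = 2.0 * x, 0.5  # std understated (true noise std is 1.0)
pit = normal_cdf((y - pred_mean) / pred_std)

def calibration_error(pit_values):
    """Mean absolute gap between empirical and nominal quantile coverage."""
    levels = np.linspace(0.05, 0.95, 19)
    coverage = np.array([(pit_values <= a).mean() for a in levels])
    return np.abs(coverage - levels).mean()

# Quantile recalibration (post hoc): remap each test PIT through the empirical
# CDF of PIT values computed on a held-out calibration split.
pit_cal, pit_test = pit[:1000], pit[1000:]
pit_recal = np.searchsorted(np.sort(pit_cal), pit_test) / pit_cal.size

err_before = calibration_error(pit_test)
err_after = calibration_error(pit_recal)
print(f"calibration error before: {err_before:.3f}, after: {err_after:.3f}")
```

The empirical-CDF remapping is also what links quantile recalibration to (split) conformal prediction, as the abstract notes: both derive corrected quantile levels from ranks on a held-out calibration set, which is where the finite-sample coverage guarantee comes from.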


