
Distribution Calibration for Regression

by Hao Song et al.

We are concerned with obtaining well-calibrated output distributions from regression models. Such distributions allow us to quantify the uncertainty that the model has regarding the predicted target value. We introduce the novel concept of distribution calibration, and demonstrate its advantages over the existing definition of quantile calibration. We further propose a post-hoc approach to improving the predictions from previously trained regression models, using multi-output Gaussian Processes with a novel Beta link function. The proposed method is experimentally verified on a set of common regression models and shows improvements for both distribution-level and quantile-level calibration.
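The quantile-calibration notion that the paper improves upon can be checked empirically: a model is quantile-calibrated if, for every level p, the observed targets fall below the predicted p-quantile about p of the time. The sketch below (an illustration, not the paper's method; the toy data, the deliberately overconfident Gaussian "model", and all variable names are assumptions) measures this via the probability integral transform:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Toy heteroscedastic-free data: sinusoidal mean with Gaussian noise of std 0.5.
n = 2000
x = rng.uniform(-2, 2, n)
y = np.sin(x) + rng.normal(0, 0.5, n)

# Hypothetical regression model: correct predictive mean, but an
# overconfident scale (0.25 instead of the true 0.5), so its
# predictive distributions are miscalibrated.
pred_mean = np.sin(x)
pred_std = np.full(n, 0.25)

# Probability integral transform: evaluate each predictive CDF F_i at y_i.
# Under perfect calibration these values are uniform on [0, 1].
pit = stats.norm.cdf(y, loc=pred_mean, scale=pred_std)

# Quantile calibration check: for each level p, the empirical fraction of
# PIT values <= p should be close to p. Report the mean absolute gap.
levels = np.linspace(0.05, 0.95, 19)
empirical = np.array([(pit <= p).mean() for p in levels])
calib_error = np.abs(empirical - levels).mean()
print(f"mean quantile calibration error: {calib_error:.3f}")
```

Because the sketched model halves the true noise scale, the PIT values pile up near 0 and 1 and the reported gap is clearly nonzero; a post-hoc recalibration method such as the one proposed above would aim to drive this gap toward zero while also matching the full predictive distributions, not just their quantiles.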


Related research:

- Quantile Regularization: Towards Implicit Calibration of Regression Models
- Parametric and Multivariate Uncertainty Calibration for Regression and Object Detection
- Distribution-Free Model-Agnostic Regression Calibration via Nonparametric Methods
- Probabilistic Models for Manufacturing Lead Times
- Quantile-based hydrological modelling
- Posterior Annealing: Fast Calibrated Uncertainty for Regression
- Quantile Encoder: Tackling High Cardinality Categorical Features in Regression Problems