MBCT: Tree-Based Feature-Aware Binning for Individual Uncertainty Calibration

02/09/2022
by Siguang Huang, et al.

Most machine learning classifiers are concerned only with classification accuracy, whereas certain applications (such as medical diagnosis, meteorological forecasting, and computational advertising) require the model to predict the true probability, known as a calibrated estimate. In previous work, researchers have developed several calibration methods that post-process the outputs of a predictor to obtain calibrated values, including binning and scaling methods. Compared with scaling, binning methods have been shown to offer distribution-free theoretical guarantees, which motivates us to prefer binning methods for calibration. However, we observe that existing binning methods have several drawbacks: (a) the binning scheme considers only the original prediction values, which limits calibration performance; and (b) the binning approach is non-individual, mapping multiple samples in a bin to the same value, and is thus unsuitable for order-sensitive applications. In this paper, we propose a feature-aware binning framework, called Multiple Boosting Calibration Trees (MBCT), along with a multi-view calibration loss, to tackle these issues. MBCT optimizes the binning scheme via tree structures over features, and adopts a linear function within each tree node to achieve individual calibration. MBCT is non-monotonic and has the potential to improve order accuracy, owing to its learnable binning scheme and individual calibration. We conduct comprehensive experiments on three datasets from different fields. Results show that our method outperforms all competing models in terms of both calibration error and order accuracy. We also conduct simulation experiments, showing that the proposed multi-view calibration loss is a better metric for modeling calibration error.
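To make drawback (b) concrete, the following is a minimal sketch of classical histogram binning, the baseline-style method the abstract critiques (not the paper's MBCT): every sample whose score falls in a bin is mapped to the same calibrated value, the bin's empirical positive rate, so ordering within a bin is lost. All function and variable names here are illustrative assumptions, not from the paper.

```python
def fit_histogram_binning(scores, labels, n_bins=10):
    """Fit equal-width histogram binning: return bin edges and the
    calibrated value (empirical positive rate) for each bin."""
    edges = [i / n_bins for i in range(n_bins + 1)]
    sums = [0.0] * n_bins
    counts = [0] * n_bins
    for s, y in zip(scores, labels):
        b = min(int(s * n_bins), n_bins - 1)  # index of the bin containing s
        sums[b] += y
        counts[b] += 1
    # Calibrated value = fraction of positives in the bin;
    # fall back to the bin midpoint when a bin received no samples.
    values = [sums[b] / counts[b] if counts[b]
              else (edges[b] + edges[b + 1]) / 2
              for b in range(n_bins)]
    return edges, values


def calibrate(score, edges, values):
    """Map a raw score to its bin's shared calibrated value."""
    n_bins = len(values)
    b = min(int(score * n_bins), n_bins - 1)
    return values[b]
```

For example, two samples with raw scores 0.11 and 0.19 land in the same bin and receive an identical calibrated value, so any order between them is erased; MBCT's feature-aware tree splits plus a per-node linear function are proposed precisely to avoid this collapse.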

