Online Platt Scaling with Calibeating

04/28/2023
by Chirag Gupta, et al.

We present an online post-hoc calibration method, called Online Platt Scaling (OPS), which combines the Platt scaling technique with online logistic regression. We demonstrate that OPS smoothly adapts between i.i.d. and non-i.i.d. settings with distribution drift. Further, in scenarios where the best Platt scaling model is itself miscalibrated, we enhance OPS by incorporating a recently developed technique called calibeating to make it more robust. Theoretically, our resulting OPS+calibeating method is guaranteed to be calibrated for adversarial outcome sequences. Empirically, it is effective on a range of synthetic and real-world datasets, with and without distribution drifts, achieving superior performance without hyperparameter tuning. Finally, we extend all OPS ideas to the beta scaling method.
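To make the idea concrete, here is a minimal sketch of what online Platt scaling could look like: the fixed base model's score (logit) is recalibrated through a sigmoid with two parameters, which are updated by a plain online-gradient-descent step on the log loss after each observed label. The class name, the learning rate, and the use of vanilla gradient descent are illustrative assumptions, not the exact online logistic regression algorithm from the paper.

```python
import math

class OnlinePlattScaling:
    """Illustrative sketch of Online Platt Scaling (OPS): online logistic
    regression on the one-dimensional score produced by a fixed base model.
    The update rule (vanilla online gradient descent) and learning rate
    are placeholder choices, not the paper's exact algorithm."""

    def __init__(self, lr=0.1):
        self.a, self.b = 1.0, 0.0  # identity recalibration at the start
        self.lr = lr

    def predict(self, score):
        """Map the base model's score (a logit) to a calibrated probability."""
        z = self.a * score + self.b
        return 1.0 / (1.0 + math.exp(-z))

    def update(self, score, y):
        """Gradient step on the log loss after observing the binary label y."""
        p = self.predict(score)
        grad = p - y  # derivative of log loss w.r.t. the recalibrated logit
        self.a -= self.lr * grad * score
        self.b -= self.lr * grad
```

At each round one would call `predict` on the base model's score, then `update` once the true label arrives; because the parameters keep adapting, the recalibration can track distribution drift rather than being fit once on a held-out set.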


