Efficient, Anytime Algorithms for Calibration with Isotonic Regression under Strictly Convex Losses

10/31/2021
by Kaan Gokcesu, et al.

We investigate the calibration of estimators, improving performance via an optimal monotone transform of the estimator outputs. We start by studying the traditional squared-error setting and its weighted variant, and show that the optimal monotone transform takes the form of a unique staircase function. We further show that this staircase behavior is preserved for general strictly convex loss functions, whose optimal monotone transforms are likewise unique, i.e., there exists a single staircase transform that achieves the minimum loss. We propose a linear time and space algorithm that finds such optimal transforms for specific loss settings. Our algorithm has an online implementation, where the optimal transform for the samples observed so far is found in linear space and amortized time when the samples arrive in an ordered fashion. We also extend our results to cases where the functions are not trivial to optimize individually, and propose an anytime algorithm with linear space and pseudo-linearithmic time complexity.
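For the weighted squared-error setting, the staircase-shaped optimal monotone transform described above coincides with the classic isotonic regression fit, which the pool-adjacent-violators algorithm (PAVA) computes in linear time over presorted samples. The sketch below is a minimal, generic PAVA implementation for illustration only; the function name `pava` and its interface are my own choices, and this is not the paper's specific algorithm.

```python
def pava(y, w=None):
    """Least-squares nondecreasing (isotonic) fit to y with optional weights w.

    Returns a list of fitted values that form a staircase: constant on pooled
    blocks, nondecreasing across blocks.
    """
    n = len(y)
    if w is None:
        w = [1.0] * n
    # Each block stores [weighted sum, total weight, count]; the block's fitted
    # value is its weighted mean.
    blocks = []
    for yi, wi in zip(y, w):
        blocks.append([yi * wi, wi, 1])
        # Merge backwards while monotonicity is violated (previous block's
        # mean is not below the new block's mean).
        while (len(blocks) > 1 and
               blocks[-2][0] / blocks[-2][1] >= blocks[-1][0] / blocks[-1][1]):
            s, t, c = blocks.pop()
            blocks[-1][0] += s
            blocks[-1][1] += t
            blocks[-1][2] += c
    # Expand blocks back to one fitted value per sample.
    fit = []
    for s, t, c in blocks:
        fit.extend([s / t] * c)
    return fit
```

Each sample is pushed and merged at most once, so the total work is linear in the number of samples, matching the complexity claimed for the squared-loss setting.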


