Multicalibration as Boosting for Regression

01/31/2023
by   Ira Globus-Harris, et al.

We study the connection between multicalibration and boosting for squared error regression. First, we prove a useful characterization of multicalibration in terms of a "swap regret"-like condition on squared error. Using this characterization, we give an exceedingly simple algorithm that can be analyzed both as a boosting algorithm for regression and as a multicalibration algorithm for a class H, and that makes use only of a standard squared error regression oracle for H. We give a weak learning assumption on H that ensures convergence to Bayes optimality without any realizability assumptions, yielding an agnostic boosting algorithm for regression. We then show that our weak learning assumption on H is both necessary and sufficient for multicalibration with respect to H to imply Bayes optimality. We also show that if H satisfies our weak learning condition relative to another class C, then multicalibration with respect to H implies multicalibration with respect to C. Finally, we investigate the empirical performance of our algorithm using an open source implementation that we make available. Our code repository can be found at https://github.com/Declancharrison/Level-Set-Boosting.
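The loop below is a minimal sketch of the level-set idea, not the paper's exact algorithm (see the linked repository for that): it repeatedly calls an off-the-shelf squared-error regression oracle on each level set of the current predictor and patches the predictions wherever the oracle lowers squared error, stopping when no level set admits further improvement. Shallow decision trees stand in for the class H here, and all names, parameters, and thresholds are illustrative assumptions.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def level_set_boost(X, y, n_levels=10, max_rounds=50, tol=1e-6):
    """Sketch of level-set boosting with a squared-error regression oracle.

    Illustrative only: H is modeled by depth-2 regression trees, and the
    bucketing/stopping details differ from the paper's algorithm.
    """
    # Start from the constant (mean) predictor.
    f = np.full(len(y), float(y.mean()))
    for _ in range(max_rounds):
        improved = False
        # Partition examples by the level set of the current predictor.
        edges = np.linspace(f.min(), f.max() + 1e-9, n_levels + 1)
        levels = np.digitize(f, edges[1:-1])
        for v in np.unique(levels):
            idx = levels == v
            if idx.sum() < 2:
                continue
            # Squared-error regression oracle for H on this level set.
            h = DecisionTreeRegressor(max_depth=2, random_state=0)
            h.fit(X[idx], y[idx])
            pred = h.predict(X[idx])
            old_err = np.mean((y[idx] - f[idx]) ** 2)
            new_err = np.mean((y[idx] - pred) ** 2)
            # Patch the predictor only where the oracle strictly improves.
            if old_err - new_err > tol:
                f[idx] = pred
                improved = True
        if not improved:  # no level set can be improved by H: stop
            break
    return f
```

Because each patch strictly decreases squared error on some level set, the in-sample squared error is monotonically non-increasing across rounds; the paper's analysis turns this into convergence and calibration guarantees under the weak learning assumption on H.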


