Boosting with the Logistic Loss is Consistent

05/13/2013
by Matus Telgarsky

This manuscript provides optimization guarantees, generalization bounds, and statistical consistency results for AdaBoost variants which replace the exponential loss with the logistic and similar losses (specifically, twice differentiable convex losses which are Lipschitz and tend to zero on one side). The heart of the analysis is to show that, in lieu of explicit regularization and constraints, the structure of the problem is fairly rigidly controlled by the source distribution itself. The first control of this type is in the separable case, where a distribution-dependent relaxed weak learning rate induces speedy convergence with high probability over any sample. Otherwise, in the nonseparable case, the convex surrogate risk itself exhibits distribution-dependent levels of curvature, and consequently the algorithm's output has small norm with high probability.
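
To make the loss class concrete, below is a minimal sketch of boosting viewed as coordinate descent on the empirical risk of the logistic loss ln(1 + exp(-m)), the canonical member of the family described above (convex, twice differentiable, Lipschitz, and tending to zero as the margin grows). This is an illustrative reconstruction under stated assumptions, not the paper's algorithm: the function name boost_logistic, the fixed step size, and the finite pool of weak learners are choices made for the example.

import numpy as np

def boost_logistic(X, y, weak_learners, n_rounds=100, step=0.1):
    """Coordinate-descent boosting on the empirical logistic risk (sketch).

    X: (n, d) features; y: labels in {-1, +1};
    weak_learners: callables h with h(X) in {-1, +1}^n.
    Returns a coefficient vector alpha over the pool of weak learners.
    """
    n = len(y)
    H = np.column_stack([h(X) for h in weak_learners])  # (n, k) predictions
    alpha = np.zeros(H.shape[1])
    for _ in range(n_rounds):
        margins = y * (H @ alpha)
        # Example weights are the negative derivative of ln(1 + exp(-m)),
        # playing the role of AdaBoost's exponential weights.
        w = 1.0 / (1.0 + np.exp(margins))
        edges = (w * y) @ H / n              # weighted edge of each learner
        j = np.argmax(np.abs(edges))         # coordinate with the best edge
        alpha[j] += step * np.sign(edges[j])
    return alpha

# Toy usage: threshold stumps on each coordinate as the weak-learner pool.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = np.sign(X[:, 0] + 0.5 * X[:, 1])
stumps = [lambda X, j=j: np.sign(X[:, j]) for j in range(5)]
alpha = boost_logistic(X, y, stumps)
preds = np.sign(np.column_stack([h(X) for h in stumps]) @ alpha)
print("training accuracy:", (preds == y).mean())

The sketch also makes the abstract's two regimes visible: on separable data the margins grow without bound and the weights w shrink toward zero, while on nonseparable data the curvature of the logistic risk keeps the coefficient vector alpha bounded.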

Related research

03/22/2021
Stability and Deviation Optimal Risk Bounds with Convergence Rate O(1/n)
The sharpest known high probability generalization bounds for uniformly ...

02/14/2022
Stochastic linear optimization never overfits with quadratically-bounded losses on general data
This work shows that a diverse collection of linear optimization methods...

10/11/2011
The Generalization Ability of Online Algorithms for Dependent Data
We study the generalization performance of online learning algorithms tr...

05/19/2022
What killed the Convex Booster?
A landmark negative result of Long and Servedio established a worst-case...

06/27/2012
Consistent Multilabel Ranking through Univariate Losses
We consider the problem of rank loss minimization in the setting of mult...

05/16/2022
ℋ-Consistency Estimation Error of Surrogate Loss Minimizers
We present a detailed study of estimation errors in terms of surrogate l...

04/04/2014
Optimal learning with Bernstein Online Aggregation
We introduce a new recursive aggregation procedure called Bernstein Onli...
