Generalized Boosting Algorithms for Convex Optimization

05/10/2011
by Alexander Grubb, et al.

Boosting is a popular way to derive powerful learners from simpler hypothesis classes. Following previous work on general boosting frameworks (Mason et al., 1999; Friedman, 2000), we analyze gradient-based descent algorithms for boosting with respect to any convex objective and introduce a new measure of weak learner performance that generalizes existing notions. Under this measure, we present weak-to-strong learning guarantees for existing gradient boosting methods on strongly-smooth, strongly-convex objectives, and show that these methods fail for non-smooth objectives. To address this, we present new algorithms that extend the boosting approach to arbitrary convex loss functions, along with corresponding weak-to-strong convergence results. Finally, we report experimental results that support our analysis and demonstrate the need for the new algorithms.
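The functional-gradient view underlying this line of work can be illustrated with a short sketch: treat the ensemble as a point in function space, compute the negative gradient of the loss at the current predictions, fit a weak learner to that gradient, and take a small step. This is a minimal sketch of the standard smooth-case gradient boosting setup the abstract refers to, not the paper's new algorithms for non-smooth losses; the regression-stump weak learner and all function names here are illustrative assumptions.

```python
import numpy as np

def fit_stump(x, r):
    """Weak learner (illustrative choice): a one-feature regression stump,
    fit to targets r by least squares over candidate split thresholds."""
    best = None
    for t in np.unique(x):
        left, right = x <= t, x > t
        if left.sum() == 0 or right.sum() == 0:
            continue
        cl, cr = r[left].mean(), r[right].mean()
        err = ((r - np.where(left, cl, cr)) ** 2).sum()
        if best is None or err < best[0]:
            best = (err, t, cl, cr)
    _, t, cl, cr = best
    return lambda z: np.where(z <= t, cl, cr)

def gradient_boost(x, y, loss_grad, n_rounds=50, step=0.1):
    """Functional gradient descent: each round fits a weak learner to the
    negative functional gradient of the loss and adds a scaled copy of it."""
    f = np.zeros_like(y, dtype=float)
    learners = []
    for _ in range(n_rounds):
        g = -loss_grad(f, y)   # negative functional gradient at current f
        h = fit_stump(x, g)    # project the gradient onto the weak class
        f = f + step * h(x)
        learners.append(h)
    def predict(z):
        out = np.zeros_like(z, dtype=float)
        for h in learners:
            out += step * h(z)
        return out
    return predict

# Example with the (strongly-smooth, strongly-convex) squared loss,
# whose functional gradient at f is simply f - y.
x = np.linspace(0.0, 1.0, 40)
y = np.sin(2 * np.pi * x)
model = gradient_boost(x, y, lambda f, y: f - y, n_rounds=200, step=0.2)
mse = np.mean((model(x) - y) ** 2)
```

For a non-smooth objective such as the hinge or absolute loss, the same loop would use a subgradient in place of `loss_grad`; the abstract's point is that the plain scheme above loses its guarantees in that setting.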


Related research

- Online Agnostic Multiclass Boosting (05/30/2022): Boosting is a fundamental approach in machine learning that enjoys both ...
- Boosting in the presence of outliers: adaptive classification with non-convex loss functions (10/05/2015): This paper examines the role and efficiency of the non-convex loss funct...
- AdaBoost and Forward Stagewise Regression are First-Order Convex Optimization Methods (07/04/2013): Boosting methods are highly popular and effective supervised learning me...
- Online Multiclass Boosting with Bandit Feedback (10/11/2018): We present online boosting algorithms for multiclass classification with...
- Online Agnostic Boosting via Regret Minimization (03/02/2020): Boosting is a widely used machine learning approach based on the idea of...
- Totally Corrective Boosting for Regularized Risk Minimization (08/30/2010): Consideration of the primal and dual problems together leads to importan...
- Symbolic-Regression Boosting (06/24/2022): Modifying standard gradient boosting by replacing the embedded weak lear...
