AdaBoost and Forward Stagewise Regression are First-Order Convex Optimization Methods

07/04/2013
by Robert M. Freund, et al.

Boosting methods are highly popular and effective supervised learning methods that combine weak learners into a single, accurate model with strong statistical performance. In this paper, we analyze two well-known boosting methods, AdaBoost and Incremental Forward Stagewise Regression (FS_ε), by establishing their precise connections to the Mirror Descent algorithm, a first-order method in convex optimization. As a consequence of these connections, we obtain novel computational guarantees for both boosting methods. In particular, we characterize convergence bounds for AdaBoost, with respect to both the margin and the log-exponential loss function, for any step-size sequence. Furthermore, this paper presents, for the first time, precise computational complexity results for FS_ε.
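
To make the setting concrete, below is a minimal sketch (not the authors' code) of AdaBoost run with an arbitrary step-size sequence, which is the regime the abstract's bounds cover. The function name, the precomputed ±1 weak-learner prediction matrix, and the toy data are all illustrative assumptions; the key point is that the exponential reweighting of the example distribution corresponds to a Mirror Descent step with the entropy prox function on the simplex.

```python
import numpy as np

def adaboost(H, y, step_sizes):
    """AdaBoost with a user-supplied step-size sequence.

    H          : (n_examples, n_weak) matrix of +/-1 weak-learner predictions
    y          : (n_examples,) labels in {-1, +1}
    step_sizes : iterable of positive step sizes alpha_t (any sequence)
    Returns the coefficient vector of the combined classifier.
    """
    n, m = H.shape
    w = np.full(n, 1.0 / n)      # distribution over examples (the MD iterate)
    coef = np.zeros(m)           # coefficients of the combined classifier
    margins = y[:, None] * H     # y_i * h_j(x_i)
    for alpha in step_sizes:
        edges = margins.T @ w    # weighted edge of each weak learner
        j = np.argmax(edges)     # pick the weak learner with the best edge
        coef[j] += alpha
        # exponential reweighting = entropy-prox Mirror Descent step
        w *= np.exp(-alpha * margins[:, j])
        w /= w.sum()
    return coef

# Toy usage on random +/-1 data (illustrative only).
rng = np.random.default_rng(0)
H = rng.choice([-1.0, 1.0], size=(200, 50))
y = rng.choice([-1.0, 1.0], size=200)
coef = adaboost(H, y, step_sizes=[0.1] * 100)   # e.g., a constant step size
print("training accuracy:", (np.sign(H @ coef) == y).mean())
```

Classical AdaBoost corresponds to the particular choice α_t = ½ ln((1 + r_t)/(1 − r_t)), where r_t is the edge of the selected weak learner; bounds that hold for any step-size sequence also cover constant or decaying choices like the one above.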

Related research

05/16/2015 · A New Perspective on Boosting in Linear Regression via Subgradient Optimization and Relatives
In this paper we analyze boosting algorithms in linear regression from a...

05/10/2011 · Generalized Boosting Algorithms for Convex Optimization
Boosting is a popular way to derive powerful learners from simpler hypot...

01/19/2021 · Unified Robust Boosting
Boosting is a popular machine learning algorithm in regression and class...

03/18/2013 · Margins, Shrinkage, and Boosting
This manuscript shows that AdaBoost and its immediate variants can produ...

10/24/2018 · Randomized Gradient Boosting Machine
Gradient Boosting Machine (GBM) introduced by Friedman is an extremely p...

02/05/2020 · A Precise High-Dimensional Asymptotic Theory for Boosting and Min-L1-Norm Interpolated Classifiers
This paper establishes a precise high-dimensional asymptotic theory for ...

09/13/2021 · Minimizing Quantum Renyi Divergences via Mirror Descent with Polyak Step Size
Quantum information quantities play a substantial role in characterizing...
