Online Coordinate Boosting

10/24/2008
by Raphael Pelossof et al.

We present a new online boosting algorithm for adapting the weights of a boosted classifier, which yields a closer approximation to Freund and Schapire's AdaBoost algorithm than previous online boosting algorithms. We also contribute a new way of deriving the online algorithm that ties together previous online boosting work. We assume that the weak hypotheses were selected beforehand, and only their weights are updated during online boosting. The update rule is derived by minimizing AdaBoost's loss when viewed in an incremental form. The resulting equations show that exact optimization is computationally expensive; however, a fast online approximation is possible. We compare the approximation error against batch AdaBoost on synthetic datasets, and the generalization error on face datasets and the MNIST dataset.
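To make the setting concrete, here is a minimal sketch of online weight adaptation over pre-selected weak hypotheses: each incoming example triggers a per-coordinate gradient step on AdaBoost's exponential loss. The decision stumps, the learning rate `eta`, and the gradient-style update itself are illustrative assumptions; the paper instead derives its update rule in closed form by minimizing the loss in incremental form.

```python
import math

def online_exp_loss_step(alphas, weak_hyps, x, y, eta=0.5):
    # One online coordinate update: a gradient step on the exponential
    # loss exp(-y * f(x)), where f(x) = sum_j alpha_j * h_j(x).
    # (Illustrative sketch only, not the paper's derived update rule.)
    preds = [h(x) for h in weak_hyps]                 # each h_j(x) in {-1, +1}
    margin = y * sum(a * p for a, p in zip(alphas, preds))
    g = math.exp(-margin)                             # exp-loss at this example
    # d(exp-loss)/d(alpha_j) = -y * h_j(x) * exp(-margin)
    return [a + eta * y * p * g for a, p in zip(alphas, preds)]

# Two fixed (pre-selected) decision stumps on a 1-D toy stream.
stumps = [lambda x: 1.0 if x > 0 else -1.0,
          lambda x: 1.0 if x > 1 else -1.0]
alphas = [0.0, 0.0]
for x, y in [(0.5, 1), (-0.5, -1), (1.5, 1), (-1.0, -1)] * 5:
    alphas = online_exp_loss_step(alphas, stumps, x, y)

def predict(x):
    return 1 if sum(a * h(x) for a, h in zip(alphas, stumps)) > 0 else -1
```

Because exp(-margin) shrinks as the ensemble's margin on an example grows, the step sizes decay automatically, mirroring how AdaBoost down-weights well-classified points.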


Related research

- An Online Boosting Algorithm with Theoretical Justifications (06/27/2012)
- Online Multiclass Boosting (02/23/2017)
- Online GentleAdaBoost – Technical Report (08/27/2023)
- Online Agnostic Multiclass Boosting (05/30/2022)
- Online Agnostic Boosting via Regret Minimization (03/02/2020)
- Asymmetric Totally-corrective Boosting for Real-time Object Detection (09/16/2010)
- Better Boosting with Bandits for Online Learning (01/16/2020)
