Online GentleAdaBoost – Technical Report

08/27/2023
by Chapman Siu et al.

We study the online variant of GentleAdaBoost, in which weak learners are combined into a strong learner in an online fashion. We extend the batch algorithm to the online setting, with theoretical justification through the application of line search. Finally, we compare our online boosting approach with other online approaches across a variety of benchmark datasets.
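The abstract does not spell out the authors' update rule, but the role of line search in boosting can be illustrated with a minimal sketch. Everything below is an assumption for illustration, not the paper's algorithm: given the current strong learner's margins F(x_i), a new weak learner's outputs h(x_i), and labels y_i in {-1, +1}, a coarse 1-D line search picks the step size gamma that minimizes the exponential loss along the direction h.

```python
import numpy as np

def line_search_step(F, h, y, grid=None):
    """Coarse 1-D line search on the exponential loss (illustrative sketch).

    F : current strong-learner margins F(x_i), shape (n,)
    h : new weak learner's outputs h(x_i), shape (n,)
    y : labels in {-1, +1}, shape (n,)

    Returns the gamma on the grid minimizing
    L(gamma) = mean(exp(-y * (F + gamma * h))).
    """
    if grid is None:
        grid = np.linspace(0.0, 1.0, 21)  # assumed search range, not from the paper
    losses = [np.exp(-y * (F + g * h)).mean() for g in grid]
    return float(grid[int(np.argmin(losses))])
```

In an online setting such a cheap search could be re-run as new examples arrive (e.g. over a small replay buffer), so that the weak learner's mixing weight tracks the streaming data; whether the paper uses a grid, exact minimization, or another scheme is not stated here.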


