Online Multiclass Boosting

02/23/2017
by Young Hun Jung, et al.

Recent work has extended the theoretical analysis of boosting algorithms to multiclass problems and to online settings. However, the multiclass extension is in the batch setting and the online extensions only consider binary classification. We fill this gap in the literature by defining, and justifying, a weak learning condition for online multiclass boosting. This condition leads to an optimal boosting algorithm that requires the minimal number of weak learners to achieve a target accuracy. Additionally, we propose an adaptive algorithm that is near optimal and enjoys excellent performance on real data due to its adaptive property.
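
For intuition about the setting, here is a minimal sketch (not the paper's algorithm) of a generic online multiclass boosting loop in Python: on each round the booster predicts by combining the votes of several online weak learners, then receives the true label and updates each learner. The toy prototype-based weak learner, the uniform vote weights, and the names OnlineWeakLearner and OnlineBooster are illustrative assumptions; the paper's optimal and adaptive algorithms choose per-learner weights and the feedback passed to each weak learner far more carefully.

```python
# Minimal sketch of a generic online multiclass boosting loop.
# Illustrative only: uniform vote weights and a toy weak learner,
# not the optimal or adaptive algorithms from the paper.
import numpy as np


class OnlineWeakLearner:
    """Toy online weak learner: keeps a running-mean prototype per class."""

    def __init__(self, num_classes, num_features, seed):
        rng = np.random.default_rng(seed)
        # Small random init so learners start with diverse predictions.
        self.prototypes = rng.normal(scale=0.01, size=(num_classes, num_features))
        self.counts = np.ones(num_classes)

    def predict(self, x):
        # Predict the class whose prototype is closest to x.
        return int(np.argmin(np.linalg.norm(self.prototypes - x, axis=1)))

    def update(self, x, y):
        # Move the true class's prototype toward x (running average).
        self.counts[y] += 1
        self.prototypes[y] += (x - self.prototypes[y]) / self.counts[y]


class OnlineBooster:
    """Combines N online weak learners by a plain (unweighted) vote."""

    def __init__(self, num_learners, num_classes, num_features):
        self.num_classes = num_classes
        self.learners = [
            OnlineWeakLearner(num_classes, num_features, seed=i)
            for i in range(num_learners)
        ]

    def predict(self, x):
        votes = np.zeros(self.num_classes)
        for learner in self.learners:
            votes[learner.predict(x)] += 1  # uniform weights for simplicity
        return int(np.argmax(votes))

    def update(self, x, y):
        for learner in self.learners:
            learner.update(x, y)  # real boosters tailor feedback per learner


# Usage: stream examples one at a time, predicting before the label is revealed.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    booster = OnlineBooster(num_learners=10, num_classes=3, num_features=5)
    mistakes = 0
    for t in range(1000):
        y = int(rng.integers(3))
        x = rng.normal(loc=y, scale=1.0, size=5)  # class-dependent mean
        if booster.predict(x) != y:
            mistakes += 1
        booster.update(x, y)
    print(f"online mistake rate: {mistakes / 1000:.3f}")
```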

