A theory of multiclass boosting

08/15/2011
by Indraneel Mukherjee, et al.

Boosting combines weak classifiers to form highly accurate predictors. Although the binary case is well understood, in the multiclass setting the "correct" requirements on the weak classifier, and the notion of the most efficient boosting algorithm, have been missing. In this paper, we create a broad and general framework within which we make precise and identify the optimal requirements on the weak classifier, and we design boosting algorithms that are, in a certain sense, the most effective under those requirements.
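The abstract refers to requirements on the weak classifier in general terms. As background (not the algorithm of this paper), here is a minimal sketch of SAMME, Zhu et al.'s multiclass generalization of AdaBoost, which illustrates the kind of weak-learning condition at issue: each weak classifier need only achieve weighted error below (K-1)/K, i.e. slightly better than random guessing among K classes. The stump learner and dataset below are illustrative choices, not from the paper.

```python
import numpy as np

def fit_stump(X, y, w, n_classes):
    """Weighted decision stump: scan every (feature, threshold) split, give
    each side its weighted-majority class, and return the lowest-error stump."""
    best_err, best = np.inf, None
    for j in range(X.shape[1]):
        for thr in np.unique(X[:, j]):
            left = X[:, j] <= thr
            sides = []
            for mask in (left, ~left):
                if mask.any():
                    counts = np.bincount(y[mask], weights=w[mask],
                                         minlength=n_classes)
                    sides.append(int(counts.argmax()))
                else:
                    sides.append(0)  # empty side: arbitrary class
            c_left, c_right = sides
            preds = np.where(left, c_left, c_right)
            err = w[preds != y].sum()
            if err < best_err:
                best_err, best = err, (j, thr, c_left, c_right)
    return best_err, best

def samme_fit(X, y, n_classes, n_rounds=20):
    """SAMME: AdaBoost-style reweighting, but the weak learner only needs
    weighted error below (K-1)/K rather than below 1/2."""
    w = np.full(len(y), 1.0 / len(y))
    ensemble = []
    for _ in range(n_rounds):
        err, (j, thr, cl, cr) = fit_stump(X, y, w, n_classes)
        if err >= 1.0 - 1.0 / n_classes:   # weak-learning condition violated
            break
        err = max(err, 1e-10)              # avoid log(0) on a perfect stump
        alpha = np.log((1.0 - err) / err) + np.log(n_classes - 1.0)
        preds = np.where(X[:, j] <= thr, cl, cr)
        w = w * np.exp(alpha * (preds != y))   # upweight the mistakes
        w /= w.sum()
        ensemble.append((alpha, j, thr, cl, cr))
    return ensemble

def samme_predict(ensemble, X, n_classes):
    """Weighted plurality vote over the stumps' predictions."""
    votes = np.zeros((len(X), n_classes))
    for alpha, j, thr, cl, cr in ensemble:
        preds = np.where(X[:, j] <= thr, cl, cr)
        votes[np.arange(len(X)), preds] += alpha
    return votes.argmax(axis=1)
```

On a toy one-dimensional problem with three well-separated classes, a few rounds of boosted stumps (each of which can only distinguish two groups) already produce a perfect plurality vote, which is exactly the gap between binary and multiclass weak learners that a theory of multiclass boosting must account for.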


