Iterative Weak Learnability and Multi-Class AdaBoost

01/26/2021
by In-Koo Cho, et al.

We construct an efficient recursive ensemble algorithm for the multi-class classification problem, inspired by SAMME (Zhu, Zou, Rosset, and Hastie (2009)). We strengthen the weak learnability condition of Zhu, Zou, Rosset, and Hastie (2009) by requiring that it hold for every subset of labels with at least two elements. This condition is simpler to check than many proposed alternatives (e.g., Mukherjee and Schapire (2013)). Like SAMME, our algorithm reduces to the Adaptive Boosting algorithm (Schapire and Freund (2012)) when there are two labels, and it can be motivated as a functional version of the steepest descent method for finding an optimal solution. In contrast to SAMME, the final hypothesis of our algorithm converges to the correct label with probability 1. For any number of labels, the probability of misclassification vanishes exponentially as the training period increases. As with the Adaptive Boosting algorithm, the generalization error of our algorithm is bounded by the sum of the training error and an additional term that depends only on the sample size.
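The paper's own algorithm is not reproduced in this abstract. As a rough orientation, the following is a minimal sketch of the SAMME-style multi-class boosting scheme it builds on, assuming labels are encoded as integers 0..K-1 and using a depth-1 decision tree as the weak learner; the function names samme_fit and samme_predict are illustrative and not taken from the paper.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def samme_fit(X, y, n_classes, n_rounds=50):
    """SAMME-style multi-class boosting (Zhu et al., 2009): reweight the sample
    after each round and keep weak learners whose weighted error beats random
    guessing among n_classes labels."""
    n = len(y)
    w = np.full(n, 1.0 / n)                # uniform initial sample weights
    learners, alphas = [], []
    for _ in range(n_rounds):
        stump = DecisionTreeClassifier(max_depth=1).fit(X, y, sample_weight=w)
        pred = stump.predict(X)
        err = np.sum(w * (pred != y)) / np.sum(w)
        if err >= 1.0 - 1.0 / n_classes:   # weak learnability condition violated
            break
        # SAMME learner weight: log-odds plus a log(K-1) correction for K classes
        alpha = np.log((1.0 - err) / max(err, 1e-12)) + np.log(n_classes - 1.0)
        w *= np.exp(alpha * (pred != y))   # upweight misclassified examples
        w /= w.sum()
        learners.append(stump)
        alphas.append(alpha)
    return learners, alphas

def samme_predict(X, learners, alphas, n_classes):
    """Final hypothesis: weighted vote of the weak learners over the labels."""
    votes = np.zeros((len(X), n_classes))
    for stump, alpha in zip(learners, alphas):
        votes[np.arange(len(X)), stump.predict(X)] += alpha
    return votes.argmax(axis=1)
```

With K = 2 the log(K-1) correction vanishes and the update coincides with the standard Adaptive Boosting weight, which is the reduction the abstract refers to.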


research
11/23/2013

Fast Training of Effective Multi-class Boosting Using Coordinate Descent Optimization

We present a novel column generation based boosting method for multi-class class...
research
02/23/2017

Online Multiclass Boosting

Recent work has extended the theoretical analysis of boosting algorithms...
research
07/12/2016

Improved Multi-Class Cost-Sensitive Boosting via Estimation of the Minimum-Risk Class

We present a simple unified framework for multi-class cost-sensitive boo...
research
05/24/2019

A Generalization Error Bound for Multi-class Domain Generalization

Domain generalization is the problem of assigning labels to an unlabeled...
research
02/22/2020

Optimistic bounds for multi-output prediction

We investigate the challenge of multi-output learning, where the goal is...
research
01/31/2023

Multicalibration as Boosting for Regression

We study the connection between multicalibration and boosting for square...
research
02/03/2014

Transductive Learning with Multi-class Volume Approximation

Given a hypothesis space, the large volume principle by Vladimir Vapnik ...
