Fast Training of Effective Multi-class Boosting Using Coordinate Descent Optimization

11/23/2013
by   Guosheng Lin, et al.

We present a novel column generation based boosting method for multi-class classification. Our multi-class boosting is formulated as a single optimization problem, as in Shen and Hao (2011). Different from most existing multi-class boosting methods, which use the same set of weak learners for all the classes, we train class-specific weak learners (i.e., each class has a different set of weak learners). We show that using separate weak learner sets for each class leads to fast convergence, without introducing additional computational overhead in the training procedure. To make the training more efficient and scalable, we also propose a fast coordinate descent method for solving the optimization problem at each boosting iteration. The proposed coordinate descent method is conceptually simple and easy to implement, as each coordinate update has a closed-form solution. Experimental results on a variety of datasets show that, compared to a range of existing multi-class boosting methods, the proposed method has a much faster convergence rate and better generalization performance in most cases. We also empirically show that the proposed fast coordinate descent algorithm needs less training time than the MultiBoost algorithm in Shen and Hao (2011).
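The closed-form coordinate update the abstract refers to can be illustrated with a simplified sketch. This is not the paper's exact multi-class formulation; it is a one-vs-rest reduction with an exponential loss, where each coordinate (a weak learner's weight) has the familiar closed-form minimizer α = ½ ln(W₊/W₋). The function name, the `n_passes` parameter, and the ±1 weak-learner encoding are all assumptions for illustration.

```python
import numpy as np

def coordinate_descent_boost(H, y, n_passes=10, eps=1e-12):
    """Coordinate descent on weak-learner weights for exponential loss.

    H : (n_samples, n_weak) outputs of a fixed pool of weak learners, in {-1, +1}.
    y : (n_samples,) labels in {-1, +1} (one-vs-rest view of a single class,
        matching the class-specific weak learner sets described in the abstract).

    Each coordinate update is closed form: with coordinate j removed, the loss
    is Wp * exp(-a) + Wm * exp(a), minimized at a = 0.5 * log(Wp / Wm).
    """
    n, m = H.shape
    w = np.zeros(m)
    margins = np.zeros(n)  # y * F(x), with F = H @ w
    for _ in range(n_passes):
        for j in range(m):
            # Remove coordinate j's current contribution from the margins.
            margins -= w[j] * y * H[:, j]
            u = np.exp(-margins)            # per-sample weights
            agree = y * H[:, j] > 0
            Wp, Wm = u[agree].sum(), u[~agree].sum()
            # Closed-form minimizer for this coordinate (eps guards log of 0).
            w[j] = 0.5 * np.log((Wp + eps) / (Wm + eps))
            margins += w[j] * y * H[:, j]
    return w
```

Because each one-dimensional subproblem is solved exactly, the exponential loss is non-increasing across coordinate updates, which is the property that makes this style of update simple to implement and fast in practice.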


Related research

07/12/2016 · Improved Multi-Class Cost-Sensitive Boosting via Estimation of the Minimum-Risk Class
We present a simple unified framework for multi-class cost-sensitive boo...

06/30/2016 · Multi-class classification: mirror descent approach
We consider the problem of multi-class classification and a stochastic o...

01/26/2021 · Iterative Weak Learnability and Multi-Class AdaBoost
We construct an efficient recursive ensemble algorithm for the multi-cla...

10/24/2018 · Randomized Gradient Boosting Machine
Gradient Boosting Machine (GBM) introduced by Friedman is an extremely p...

04/01/2020 · Fully-Corrective Gradient Boosting with Squared Hinge: Fast Learning Rates and Early Stopping
Boosting is a well-known method for improving the accuracy of weak learn...

06/17/2020 · MixBoost: A Heterogeneous Boosting Machine
Modern gradient boosting software frameworks, such as XGBoost and LightG...

10/18/2011 · AOSO-LogitBoost: Adaptive One-Vs-One LogitBoost for Multi-Class Problem
This paper presents an improvement to model learning when using multi-cl...
