Fast ABC-Boost: A Unified Framework for Selecting the Base Class in Multi-Class Classification

05/22/2022
by Ping Li, et al.

The work in ICML'09 showed that the derivatives of the classical multi-class logistic regression loss function can be rewritten in terms of a pre-chosen "base class," and applied the new derivatives within the popular boosting framework. To make use of the new derivatives, one needs a strategy for identifying/choosing the base class at each boosting iteration. The "adaptive base class boost" (ABC-Boost) algorithm in ICML'09 adopted a computationally expensive "exhaustive search" strategy, re-selecting the base class at every iteration. It has been well demonstrated that ABC-Boost, when integrated with trees, achieves substantial improvements on many multi-class classification tasks. Furthermore, the work in UAI'10 derived an explicit second-order tree-split gain formula, which typically improves classification accuracy considerably, compared with using only first-order information for tree splitting, for both multi-class and binary classification tasks. In this paper, we develop a unified framework for effectively selecting the base class, introducing a series of ideas that improve the computational efficiency of ABC-Boost. Our framework has three parameters, (s, g, w). At each boosting iteration, we search only among the "s-worst classes" (instead of all classes) to determine the base class. We also allow a "gap" g in the search: the base class is re-searched only once every g+1 iterations. We furthermore allow a "warm-up" stage, starting the search only after the first w boosting iterations. The parameters s, g, w can be viewed as tunable, and certain combinations of (s, g, w) may even lead to better test accuracy than the exhaustive-search strategy. Overall, our proposed framework provides a robust and reliable scheme for implementing ABC-Boost in practice.
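For readers who want the gist of the derivative rewriting referenced above, the following is a sketch in our own notation (see the ICML'09 and UAI'10 papers for the actual derivation). For sample i with class probabilities p_{i,k} and one-hot labels r_{i,k}, under the sum-to-zero constraint on the K class scores F_{i,k}, choosing a base class b gives, for each class k ≠ b:

```latex
\frac{\partial L_i}{\partial F_{i,k}}
  = \left(p_{i,k} - r_{i,k}\right) - \left(p_{i,b} - r_{i,b}\right),
\qquad
\frac{\partial^2 L_i}{\partial F_{i,k}^2}
  = p_{i,k}\left(1 - p_{i,k}\right) + p_{i,b}\left(1 - p_{i,b}\right) + 2\,p_{i,k}\,p_{i,b},
```

where p_{i,k} = e^{F_{i,k}} / \sum_m e^{F_{i,m}}. Only the K-1 non-base classes are boosted at each iteration, which is why the choice of b matters.

The (s, g, w) schedule itself is easy to state in code. Below is a minimal sketch, not the authors' released implementation; the function name, the per-class-loss candidate ranking, and the trial_loss hook (fit one trial boosting step with a given base class and report the resulting training loss) are all our own illustrative assumptions.

```python
import numpy as np

def choose_base_class(t, prev_base, per_class_loss, trial_loss, s, g, w):
    """Pick the base class for boosting iteration t (0-indexed).

    A fresh search runs only once every g + 1 iterations and never during
    the first w (warm-up) iterations; otherwise the previous choice is kept.
    Only the s classes with the largest training loss ("s-worst") are tried.
    """
    if t < w or (t - w) % (g + 1) != 0:
        return prev_base  # warm-up or gap iteration: reuse the last base class

    worst_first = np.argsort(per_class_loss)[::-1]
    candidates = worst_first[: min(s, len(worst_first))]
    # Exhaustive ABC-Boost would call trial_loss for every class at every
    # iteration; restricting it to s candidates once every g + 1 iterations
    # is the source of the speedup.
    return int(min(candidates, key=trial_loss))

# Toy demonstration with a stand-in trial_loss (a real one would fit trees):
per_class_loss = np.array([0.9, 2.5, 1.1, 3.0])
base = choose_base_class(t=25, prev_base=0, per_class_loss=per_class_loss,
                         trial_loss=lambda b: per_class_loss[b],
                         s=2, g=4, w=20)
```

Note that with s = K, g = 0, and w = 0 the schedule reduces to the original exhaustive search, which is one way to see why (s, g, w) can be treated as tunable parameters.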


