Package for Fast ABC-Boost

07/18/2022
by Ping Li, et al.

This report presents the open-source package that implements the series of our boosting works from the past years. The package includes mainly three lines of techniques, among which the following two are already standard implementations in popular boosted-tree platforms:

(i) The histogram-based (feature-binning) approach makes the tree implementation convenient and efficient. In Li et al. (2007), a simple fixed-length adaptive binning algorithm was developed. In this report, we demonstrate that this simple algorithm remains surprisingly effective compared to the more sophisticated variants in popular tree platforms.

(ii) The explicit gain formula in Li (2010) for tree splitting, based on second-order derivatives of the loss function, typically improves, often considerably, over first-order methods. Although the gain formula in Li (2010) was derived for the logistic regression loss, it is a generic formula for any loss function with second derivatives. For example, the open-source package also includes L_p regression for p ≥ 1.

The main contribution of this package is ABC-Boost (adaptive base class boosting) for multi-class classification. The initial work in Li (2008) derived a new set of derivatives of the classical multi-class logistic regression by specifying a "base class". The accuracy can be substantially improved if the base class is chosen properly. The major technical challenge is to design a search strategy for selecting the base class: prior published works relied on an exhaustive search procedure, which is computationally too expensive. Recently, a new report (Li and Zhao, 2022) presented a unified framework of "Fast ABC-Boost", which allows users to efficiently choose a proper search space for the base class. The package provides interfaces for Linux, Windows, macOS, MATLAB, R, and Python.
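The fixed-length adaptive binning of Li et al. (2007) is essentially quantile-style: each feature is independently mapped to a small number of integer bin codes so that bins hold roughly equal numbers of samples. Below is a minimal Python sketch of this idea; the function name and details are illustrative assumptions, not the package's actual code:

```python
import numpy as np

def adaptive_bin(values, max_bins=256):
    """Quantile-style adaptive binning sketch: bin edges are placed at
    equally spaced quantiles, so each bin holds roughly the same number
    of samples. An illustrative approximation, not the exact algorithm."""
    vals = np.sort(np.asarray(values, dtype=float))
    # interior quantile levels, e.g. [0.25, 0.5, 0.75] for max_bins=4
    qs = np.linspace(0.0, 1.0, max_bins + 1)[1:-1]
    edges = np.unique(np.quantile(vals, qs))
    # map each raw value to a small integer bin index
    codes = np.searchsorted(edges, values, side="right")
    return codes, edges
```

After binning, split search only needs to scan the (at most `max_bins`) boundaries per feature instead of all raw values, which is what makes histogram-based trees fast.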
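The second-order gain formula scores a split by summing, over each child, the squared sum of first derivatives divided by the sum of second derivatives (for the logistic loss: residuals r_i and weights p_i(1 - p_i)), minus the same quantity for the parent node. A hedged sketch over one pre-binned feature follows; the function names are illustrative, not the package API:

```python
import numpy as np

def split_gain(r, w, left_mask, eps=1e-12):
    """Second-order split gain: (sum of first derivatives)^2 over
    (sum of second derivatives), left child + right child - parent.
    r: per-sample residuals (negative first derivatives of the loss)
    w: per-sample second derivatives, e.g. p * (1 - p) for logistic loss
    """
    def score(rs, ws):
        return rs.sum() ** 2 / (ws.sum() + eps)
    return (score(r[left_mask], w[left_mask])
            + score(r[~left_mask], w[~left_mask])
            - score(r, w))

def best_split(bin_codes, r, w):
    """Scan the bin boundaries of one pre-binned feature and return the
    threshold with the largest second-order gain."""
    best_t, best_g = None, 0.0
    for t in np.unique(bin_codes)[:-1]:
        g = split_gain(r, w, bin_codes <= t)
        if g > best_g:
            best_t, best_g = t, g
    return best_t, best_g
```

First-order methods replace the denominator with a constant (the sample count), which is exactly where the second-order formula gains its advantage: high-confidence samples with small p(1 - p) are down-weighted less crudely.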
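The exhaustive base-class search that Fast ABC-Boost is designed to avoid can be pictured as the loop below: train one full boosting model per candidate base class and keep the one that scores best. Here `train_fn` and `eval_fn` are placeholders for the user's own training and validation routines (lower is better), not the package's interface:

```python
def exhaustive_base_class(train_fn, eval_fn, num_classes):
    """Exhaustive search (the expensive baseline): one full training run
    per candidate base class. Fast ABC-Boost instead lets the user
    restrict this search space. train_fn/eval_fn are placeholders."""
    return min(range(num_classes),
               key=lambda b: eval_fn(train_fn(base=b)))
```

With K classes this multiplies training cost by K, which is why restricting the search space (as in Li and Zhao, 2022) matters in practice.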


