
Package for Fast ABC-Boost

by   Ping Li, et al.

This report presents the open-source package which implements the series of our boosting works developed over the past years. In particular, the package mainly includes three lines of techniques, among which the following two are already standard implementations in popular boosted tree platforms:

(i) The histogram-based (feature-binning) approach makes the tree implementation convenient and efficient. In Li et al (2007), a simple fixed-length adaptive binning algorithm was developed. In this report, we demonstrate that such a simple algorithm is still surprisingly effective compared to more sophisticated variants in popular tree platforms.

(ii) The explicit gain formula in Li (2010) for tree splitting based on second-order derivatives of the loss function typically improves, often considerably, over the first-order methods. Although the gain formula in Li (2010) was derived for the logistic regression loss, it is a generic formula for loss functions with second derivatives. For example, the open-source package also includes L_p regression for p ≥ 1.

The main contribution of this package is ABC-Boost (adaptive base class boosting) for multi-class classification. The initial work in Li (2008) derived a new set of derivatives of the classical multi-class logistic regression by specifying a "base class". The accuracy can be substantially improved if the base class is chosen properly. The major technical challenge is to design a search strategy to select the base class. The prior published works implemented an exhaustive search procedure to find the base class, which is computationally too expensive. Recently, a new report (Li and Zhao, 2022) presents a unified framework of "Fast ABC-Boost" which allows users to efficiently choose the proper search space for the base class. The package provides interfaces for Linux, Windows, Mac, Matlab, R, and Python.
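The two standard ingredients described above can be illustrated with a minimal sketch. This is not the package's actual implementation; the function names, the quantile-based choice of bin boundaries, and the unregularized form of the second-order gain are assumptions for illustration, in the spirit of the fixed-length adaptive binning of Li et al (2007) and the explicit second-order gain formula of Li (2010):

```python
import numpy as np

def fixed_length_bin(feature, num_bins=128):
    """Hypothetical sketch of fixed-length adaptive binning: map raw
    feature values to integer bin indices via data-adaptive (quantile)
    cut points, so every feature is stored in a small fixed-width code."""
    quantiles = np.linspace(0.0, 1.0, num_bins + 1)[1:-1]  # interior cuts
    edges = np.quantile(feature, quantiles)
    return np.searchsorted(edges, feature)  # indices in [0, num_bins - 1]

def split_gain(g_left, h_left, g_right, h_right, eps=1e-12):
    """Second-order split gain, in the spirit of the explicit formula in
    Li (2010): g_* and h_* are the sums of the first and second
    derivatives of the loss over the examples in each child node."""
    g_total, h_total = g_left + g_right, h_left + h_right
    return (g_left ** 2 / (h_left + eps)
            + g_right ** 2 / (h_right + eps)
            - g_total ** 2 / (h_total + eps))
```

With binned features, a candidate split only needs the per-bin accumulated sums of `g` and `h`, so evaluating all thresholds for one feature costs O(num_bins) rather than O(n); a split that leaves both children with the same derivative ratio yields (near-)zero gain, while an informative split yields a positive gain.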



