AOSO-LogitBoost: Adaptive One-Vs-One LogitBoost for Multi-Class Problem

10/18/2011
by   Peng Sun, et al.

This paper presents an improvement to model learning when using multi-class LogitBoost for classification. Motivated by the statistical view, LogitBoost can be seen as additive tree regression. Two important factors in this setting are: 1) coupled classifier outputs due to a sum-to-zero constraint, and 2) the dense Hessian matrices that arise when computing tree node split gain and fitting node values. In general, this setting is too complicated for a tractable model learning algorithm. However, overly aggressive simplification of the setting may lead to degraded performance. For example, the original LogitBoost is outperformed by ABC-LogitBoost due to the latter's more careful treatment of the above two factors. In this paper we propose techniques to address the two main difficulties of the LogitBoost setting: 1) we adopt a vector tree (i.e., each node value is a vector) that enforces the sum-to-zero constraint, and 2) we use an adaptive block coordinate descent that exploits the dense Hessian when computing tree split gain and node values. Higher classification accuracy and faster convergence rates are observed on a range of public data sets when compared to both the original LogitBoost and ABC-LogitBoost.
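To make the two-coordinate idea concrete, the sketch below is an illustrative Python fragment, not the authors' code. It shows how a split gain and node value could be computed for a candidate class pair (r, s) from the standard multi-class logistic gradient and Hessian; the fitted vector node value t * (e_r - e_s) then satisfies the sum-to-zero constraint by construction. The function names and the exhaustive pair scan are assumptions made for illustration; the paper's adaptive selection of the class pair is more economical than this full scan.

import numpy as np

def pair_gain_and_value(y, p, r, s):
    # Hypothetical helper, not from the paper's code.
    # y: (n, K) one-hot labels of the samples falling in a tree node
    # p: (n, K) current class probabilities for those samples
    # Gradient of the multi-class logistic loss along the direction e_r - e_s
    g = np.sum((y[:, r] - p[:, r]) - (y[:, s] - p[:, s]))
    # Dense Hessian restricted to the pair, projected onto e_r - e_s
    h = np.sum(p[:, r] * (1.0 - p[:, r]) + p[:, s] * (1.0 - p[:, s])
               + 2.0 * p[:, r] * p[:, s])
    h = max(h, 1e-12)  # guard against near-pure or empty nodes
    gain = g * g / h   # split-gain contribution (up to a constant factor)
    value = g / h      # Newton step; node vector value is value * (e_r - e_s)
    return gain, value

def best_class_pair(y, p):
    # Exhaustive O(K^2) scan over class pairs, for illustration only.
    K = p.shape[1]
    best = (-np.inf, None, None)
    for r in range(K):
        for s in range(r + 1, K):
            gain, _ = pair_gain_and_value(y, p, r, s)
            if gain > best[0]:
                best = (gain, r, s)
    return best

Because the node update touches only two coordinates with opposite signs, each per-sample output vector keeps summing to zero, while the pairwise Hessian term 2 * p_r * p_s is exactly the off-diagonal coupling that a diagonal approximation would discard.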
