Improved Multi-Class Cost-Sensitive Boosting via Estimation of the Minimum-Risk Class

07/12/2016
by   Ron Appel, et al.

We present a simple unified framework for multi-class cost-sensitive boosting. The minimum-risk class is estimated directly, rather than via an approximation of the posterior distribution. Our method jointly optimizes binary weak learners and their corresponding output vectors, requiring classes to share features at each iteration. By training in a cost-sensitive manner, weak learners are invested in separating classes whose discrimination is important, at the expense of less relevant classification boundaries. Additional contributions are a family of loss functions along with proof that our algorithm is Boostable in the theoretical sense, as well as an efficient procedure for growing decision trees for use as weak learners. We evaluate our method on a variety of datasets: a collection of synthetic planar data, common UCI datasets, MNIST digits, SUN scenes, and CUB-200 birds. Results show state-of-the-art performance across all datasets against several strong baselines, including non-boosting multi-class approaches.
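The abstract's central idea is choosing the minimum-risk class rather than the most probable one. The paper estimates this class directly, but as a point of reference, the classical minimum-expected-risk decision rule can be sketched as follows (a minimal illustration, not the paper's algorithm; the function name and toy cost matrix are hypothetical):

```python
import numpy as np

def min_risk_class(posteriors, cost):
    """Pick the class with the lowest expected misclassification cost.

    posteriors : (n_samples, K) estimated class probabilities P(y|x)
    cost       : (K, K) matrix; cost[y, k] = cost of predicting k when truth is y
    """
    # Expected cost (risk) of predicting each class k: sum_y P(y|x) * cost[y, k]
    risk = posteriors @ cost
    return np.argmin(risk, axis=1)

# Toy example with 3 classes: confusing true class 0 with class 2 is very costly.
cost = np.array([[0.0, 1.0, 5.0],
                 [1.0, 0.0, 1.0],
                 [1.0, 1.0, 0.0]])
p = np.array([[0.40, 0.15, 0.45]])   # class 2 is the most probable...
print(min_risk_class(p, cost))       # ...but class 0 carries the lowest risk
```

Here the maximum-posterior prediction (class 2) differs from the minimum-risk prediction (class 0), which is exactly the distinction that motivates cost-sensitive training: weak learners should spend capacity on the class boundaries whose confusion is expensive.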


Related research
11/23/2013

Fast Training of Effective Multi-class Boosting Using Coordinate Descent Optimization

We present a novel column generation based boosting method for multi-class class...
07/08/2015

Double-Base Asymmetric AdaBoost

Based on the use of different exponential bases to define class-dependen...
02/19/2020

Gradient Boosting Neural Networks: GrowNet

A novel gradient boosting framework is proposed where shallow neural net...
01/26/2021

Iterative Weak Learnability and Multi-Class AdaBoost

We construct an efficient recursive ensemble algorithm for the multi-cla...
03/15/2012

Robust LogitBoost and Adaptive Base Class (ABC) LogitBoost

LogitBoost is an influential boosting algorithm for classification. In t...
05/22/2022

Fast ABC-Boost: A Unified Framework for Selecting the Base Class in Multi-Class Classification

The work in ICML'09 showed that the derivatives of the classical multi-c...
01/07/2010

An Empirical Evaluation of Four Algorithms for Multi-Class Classification: Mart, ABC-Mart, Robust LogitBoost, and ABC-LogitBoost

This empirical study is mainly devoted to comparing four tree-based boos...
