Logarithmic Time One-Against-Some

06/15/2016
by Hal Daumé III, et al.

We create a new online reduction of multiclass classification to binary classification for which both training and prediction time scale logarithmically with the number of classes. Compared to previous approaches, we obtain substantially better statistical performance for two reasons: first, we prove a tighter and more complete boosting theorem, and second, we translate these results more directly into an algorithm. We show that several simple techniques yield an algorithm that can compete with one-against-all in both space and predictive power while offering exponential improvements in speed when the number of classes is large.
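To illustrate where the logarithmic scaling comes from (this is an illustrative sketch of the general tree-of-binary-classifiers idea, not the paper's One-Against-Some algorithm): arranging K classes at the leaves of a balanced binary tree, with a binary classifier at each internal node, means a prediction requires only one binary decision per level, i.e. O(log K) decisions instead of the K scores computed by one-against-all. The `binary_predict` function below is a hypothetical stand-in for a trained binary classifier.

```python
# Illustrative sketch: multiclass prediction in O(log K) time via a
# balanced binary tree of binary classifiers (NOT the paper's algorithm).
import math

def build_tree(classes):
    """Recursively split the class set in half; leaves hold single classes."""
    if len(classes) == 1:
        return {"leaf": classes[0]}
    mid = len(classes) // 2
    return {"left": build_tree(classes[:mid]),
            "right": build_tree(classes[mid:])}

def predict(tree, x, binary_predict):
    """Walk root-to-leaf: one binary decision per level, O(log K) total."""
    depth = 0
    while "leaf" not in tree:
        # binary_predict is a placeholder for a learned binary classifier
        tree = tree["right"] if binary_predict(tree, x) else tree["left"]
        depth += 1
    return tree["leaf"], depth

# Toy usage: 1024 classes, a dummy predictor that always branches left.
classes = list(range(1024))
tree = build_tree(classes)
label, depth = predict(tree, x=None, binary_predict=lambda node, x: False)
assert label == 0
assert depth == math.ceil(math.log2(len(classes)))  # 10 decisions, not 1024
```

The point of the sketch is only the cost structure: with K = 1024 classes, a root-to-leaf walk makes 10 binary decisions, whereas one-against-all would evaluate 1024 scorers. The paper's contribution is in how the binary problems are constructed and trained online so that this speedup does not sacrifice statistical performance.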


