Top-k Multiclass SVM

11/20/2015
by Maksim Lapin, et al.

Class ambiguity is typical in image classification problems with a large number of classes. When classes are difficult to discriminate, it makes sense to allow k guesses and evaluate classifiers based on the top-k error instead of the standard zero-one loss. We propose top-k multiclass SVM as a direct method to optimize for top-k performance. Our generalization of the well-known multiclass SVM is based on a tight convex upper bound of the top-k error. We propose a fast optimization scheme based on an efficient projection onto the top-k simplex, which is of independent interest. Experiments on five datasets show consistent improvements in top-k accuracy over various baselines.
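To make the evaluation criterion concrete, the sketch below computes the top-k error described in the abstract, together with an illustrative convex upper bound on it in the spirit of the paper's loss (the function names and the exact surrogate are our own; the paper's tight bound differs in its details).

```python
def top_k_error(scores, labels, k):
    """Fraction of examples whose true class is not among the
    k highest-scoring classes (the top-k error; k=1 recovers
    the standard zero-one error)."""
    errors = 0
    for s, y in zip(scores, labels):
        # Indices of the k highest-scoring classes.
        top_k = sorted(range(len(s)), key=lambda j: s[j], reverse=True)[:k]
        errors += int(y not in top_k)
    return errors / len(labels)


def top_k_hinge(scores, y, k):
    """Illustrative convex upper bound on the top-k error for one
    example: with c_j = 1 + s_j - s_y for j != y and c_y = 0, take
    the positive part of the mean of the k largest entries of c.
    If y falls outside the top k, at least k entries of c are >= 1,
    so the loss is >= 1; for k=1 this reduces to the Crammer-Singer
    multiclass hinge loss."""
    c = [(1.0 + s - scores[y]) if j != y else 0.0
         for j, s in enumerate(scores)]
    top = sorted(c, reverse=True)[:k]
    return max(0.0, sum(top) / k)
```

For example, with scores `[[0.1, 0.5, 0.4], [0.3, 0.2, 0.5]]` and labels `[2, 0]`, neither true class is ranked first, so the top-1 error is 1.0, while both fall within the top two, so the top-2 error is 0.0.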

