Multiclass learning with margin: exponential rates with no bias-variance trade-off

02/03/2022
by Stefano Vigogna, et al.

We study the behavior of error bounds for multiclass classification under suitable margin conditions. For a wide variety of methods we prove that, under a hard-margin condition, the classification error decreases exponentially fast with no bias-variance trade-off. Different convergence rates are obtained under different margin assumptions. Through a self-contained and instructive analysis, we generalize known results from the binary to the multiclass setting.
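To fix ideas, here is a hedged sketch of the type of condition and guarantee at play, following the standard formulation from the binary-case literature (the symbols δ, C, c below are illustrative; the paper's exact assumptions and constants may differ). Writing ρ_y(x) = P(Y = y | X = x) for the conditional class probabilities and y*(x) for the Bayes-optimal label, a hard-margin condition asks that

\[
\rho_{y^\star(x)}(x) \;-\; \max_{y \neq y^\star(x)} \rho_y(x) \;\geq\; \delta
\quad \text{almost surely, for some fixed } \delta > 0,
\]

and under such a condition one expects bounds of the form

\[
\mathbb{E}\big[ R(\hat f_n) \big] - R(f^\star) \;\leq\; C\, e^{-c\, n},
\]

i.e. the excess misclassification risk decays exponentially in the sample size n, with no trade-off between approximation and estimation error.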


Related research

- "A Case of Exponential Convergence Rates for SVM" (05/20/2022): Classification is often the first problem described in introductory mach...
- "Fast classification rates without standard margin assumptions" (10/28/2019): We consider the classical problem of learning rates for classes with fin...
- "Towards A Deeper Geometric, Analytic and Algorithmic Understanding of Margins" (06/20/2014): Given a matrix A, a linear feasibility problem (of which linear classifi...
- "Harder, Better, Faster, Stronger Convergence Rates for Least-Squares Regression" (02/17/2016): We consider the optimization of a quadratic objective function whose gra...
- "Beyond Least-Squares: Fast Rates for Regularized Empirical Risk Minimization through Self-Concordance" (02/08/2019): We consider learning methods based on the regularization of a convex emp...
- "Is margin preserved after random projection?" (06/18/2012): Random projections have been applied in many machine learning algorithms...
- "Bias-Variance Decompositions for Margin Losses" (04/26/2022): We introduce a novel bias-variance decomposition for a range of strictly...
