Tight Sample Complexity of Large-Margin Learning

11/23/2010
by Sivan Sabato et al.

We obtain a tight distribution-specific characterization of the sample complexity of large-margin classification with L_2 regularization. We introduce the γ-adapted-dimension, a simple function of the spectrum of a distribution's covariance matrix, and show distribution-specific upper and lower bounds on the sample complexity, both governed by the γ-adapted-dimension of the source distribution. We conclude that this new quantity tightly characterizes the true sample complexity of large-margin classification. The bounds hold for a rich family of sub-Gaussian distributions.
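
The γ-adapted-dimension can be computed directly from a covariance spectrum. Below is a minimal sketch, assuming a definition of the form k_γ = min{k : Σ_{i>k} λ_i ≤ γ²k}, where λ_1 ≥ λ_2 ≥ ... are the eigenvalues of the covariance matrix in decreasing order; this formula and the function name are illustrative assumptions, since the abstract does not spell out the definition.

import numpy as np

def gamma_adapted_dimension(cov, gamma):
    # Illustrative sketch under the assumed definition: the smallest k
    # whose tail eigenvalue mass sum_{i>k} lambda_i is at most gamma^2 * k.
    eigs = np.sort(np.linalg.eigvalsh(cov))[::-1]  # eigenvalues, decreasing
    total = eigs.sum()
    head = 0.0
    for k in range(1, len(eigs) + 1):
        head += eigs[k - 1]
        tail = total - head  # sum of eigenvalues beyond the top k
        if tail <= gamma ** 2 * k:
            return k
    return len(eigs)  # unreachable: the tail is empty at k = len(eigs)

# Example: for an isotropic covariance in 10 dimensions and gamma = 1,
# the condition 10 - k <= k first holds at k = 5.
print(gamma_adapted_dimension(np.eye(10), gamma=1.0))  # prints 5

Under this assumed definition, an isotropic spectrum yields a quantity that grows with the ambient dimension, while a fast-decaying spectrum keeps it small, consistent with the paper's thesis that the sample complexity of large-margin classification is governed by how the distribution's variance is spread across directions.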

Related research

04/05/2012 · Distribution-Dependent Sample Complexity of Large Margin Learning
We obtain a tight distribution-specific characterization of the sample c...

06/01/2019 · Graph-based Discriminators: Sample Complexity and Expressiveness
A basic question in learning theory is to identify if two distributions ...

02/04/2020 · Efficient, Noise-Tolerant, and Private Learning via Boosting
We introduce a simple framework for designing private boosting algorithm...

06/14/2019 · A Distribution Dependent and Independent Complexity Analysis of Manifold Regularization
Manifold regularization is a commonly used technique in semi-supervised ...

04/18/2023 · Impossibility of Characterizing Distribution Learning – a simple solution to a long-standing problem
We consider the long-standing question of finding a parameter of a class...

12/07/2020 · VC Dimension and Distribution-Free Sample-Based Testing
We consider the problem of determining which classes of functions can be...

05/14/2019 · Sample Efficient Toeplitz Covariance Estimation
We study the query complexity of estimating the covariance matrix T of a...
