
Tight Sample Complexity of Large-Margin Learning

by Sivan Sabato et al.
Hebrew University of Jerusalem
Toyota Technological Institute at Chicago

We obtain a tight distribution-specific characterization of the sample complexity of large-margin classification with L_2 regularization. We introduce the γ-adapted-dimension, a simple function of the spectrum of a distribution's covariance matrix, and prove distribution-specific upper and lower bounds on the sample complexity, both governed by the γ-adapted-dimension of the source distribution. We conclude that this new quantity tightly characterizes the true sample complexity of large-margin classification. The bounds hold for a rich family of sub-Gaussian distributions.
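Roughly, the γ-adapted-dimension is the smallest k such that the tail of the covariance spectrum beyond the top k eigenvalues is at most γ²·k, so it interpolates between the margin-based bound and the ambient dimension. The sketch below computes this quantity under that assumed form of the definition (the paper's exact definition may differ in constants or edge cases):

```python
import numpy as np

def gamma_adapted_dimension(eigenvalues, gamma):
    """Smallest k with sum_{i>k} lambda_i <= gamma^2 * k.

    `eigenvalues` is the spectrum of the distribution's covariance
    matrix (any order). This is an assumed form of the paper's
    gamma-adapted-dimension, used here only for illustration.
    """
    lam = np.sort(np.asarray(eigenvalues, dtype=float))[::-1]  # descending
    for k in range(1, len(lam) + 1):
        if lam[k:].sum() <= (gamma ** 2) * k:
            return k
    return len(lam)

# A fast-decaying spectrum yields a small gamma-adapted-dimension,
# while an isotropic spectrum or a small margin gamma yields a large one.
print(gamma_adapted_dimension([1, 1, 1, 0, 0], 1.0))
print(gamma_adapted_dimension([1.0] * 4, 0.1))
```

Note the quantity is monotone: shrinking the margin γ can only increase the returned k, which matches the intuition that smaller margins require more samples.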



