
Tight Sample Complexity of Large-Margin Learning

11/23/2010
by Sivan Sabato et al.
Hebrew University of Jerusalem
Toyota Technological Institute at Chicago

We obtain a tight distribution-specific characterization of the sample complexity of large-margin classification with L_2 regularization: We introduce the γ-adapted-dimension, which is a simple function of the spectrum of a distribution's covariance matrix, and show distribution-specific upper and lower bounds on the sample complexity, both governed by the γ-adapted-dimension of the source distribution. We conclude that this new quantity tightly characterizes the true sample complexity of large-margin classification. The bounds hold for a rich family of sub-Gaussian distributions.
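As a rough illustration of how such a spectral quantity can be computed, here is a minimal NumPy sketch. It assumes the γ-adapted-dimension takes the form k_γ = min{k : Σ_{i>k} λ_i ≤ γ²·k}, where λ_1 ≥ λ_2 ≥ ... are the eigenvalues of the covariance matrix sorted in non-increasing order; the function name and this formulation are our paraphrase of the paper's definition, not its exact statement.

```python
import numpy as np

def gamma_adapted_dimension(eigenvalues, gamma):
    """Sketch of the gamma-adapted-dimension of a covariance spectrum.

    Assumes the definition k_gamma = min{k : sum_{i>k} lambda_i <= gamma^2 * k},
    with eigenvalues sorted in non-increasing order; see the paper for the
    exact statement.
    """
    lam = np.sort(np.asarray(eigenvalues, dtype=float))[::-1]  # lambda_1 >= lambda_2 >= ...
    tail = np.cumsum(lam[::-1])[::-1]  # tail[k] = sum of lam[k:], i.e. mass beyond the first k
    for k in range(1, len(lam) + 1):
        residual = tail[k] if k < len(lam) else 0.0  # spectrum mass outside the top-k directions
        if residual <= gamma**2 * k:
            return k
    return len(lam)

# An isotropic spectrum in R^d gives k_gamma ~ d / (1 + gamma^2), while a
# fast-decaying spectrum yields a small k_gamma regardless of ambient dimension.
iso = np.ones(100)                   # isotropic covariance in R^100
decay = 1.0 / np.arange(1, 101)**2   # rapidly decaying spectrum
print(gamma_adapted_dimension(iso, gamma=1.0))    # 50
print(gamma_adapted_dimension(decay, gamma=1.0))  # 1
```

Under this reading, with γ = 1 the isotropic spectrum in R^100 gives k_γ = 50 while the fast-decaying spectrum gives k_γ = 1, illustrating the abstract's point that the sample complexity of large-margin learning is governed by the shape of the covariance spectrum rather than the raw ambient dimension.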

Related research

04/05/2012
Distribution-Dependent Sample Complexity of Large Margin Learning
We obtain a tight distribution-specific characterization of the sample c...

06/01/2019
Graph-based Discriminators: Sample Complexity and Expressiveness
A basic question in learning theory is to identify if two distributions ...

04/18/2023
Impossibility of Characterizing Distribution Learning – a simple solution to a long-standing problem
We consider the long-standing question of finding a parameter of a class...

06/14/2019
A Distribution Dependent and Independent Complexity Analysis of Manifold Regularization
Manifold regularization is a commonly used technique in semi-supervised ...

01/20/2022
Reproducibility in Learning
We introduce the notion of a reproducible algorithm in the context of le...

05/14/2019
Sample Efficient Toeplitz Covariance Estimation
We study the query complexity of estimating the covariance matrix T of a...

05/11/2015
Sample complexity of learning Mahalanobis distance metrics
Metric learning seeks a transformation of the feature space that enhance...