Distribution-Dependent Sample Complexity of Large Margin Learning

04/05/2012
by Sivan Sabato, et al.

We obtain a tight distribution-specific characterization of the sample complexity of large-margin classification with L2 regularization. We introduce the margin-adapted dimension, a simple function of the second-order statistics of the data distribution, and show distribution-specific upper and lower bounds on the sample complexity, both governed by the margin-adapted dimension of the data distribution. The upper bounds are universal, and the lower bounds hold for the rich family of sub-Gaussian distributions with independent features. We conclude that this new quantity tightly characterizes the true sample complexity of large-margin classification. To prove the lower bound, we develop several new tools of independent interest: new connections between shattering and hardness of learning, new properties of shattering with linear classifiers, and a new lower bound on the smallest eigenvalue of a random Gram matrix generated by sub-Gaussian variables. Our results can be used to quantitatively compare large-margin learning to other learning rules, and to improve the effectiveness of methods that rely on sample complexity bounds, such as active learning.
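Since the margin-adapted dimension is a function of the data's second-order statistics, it can be estimated directly from a sample's covariance spectrum. The sketch below is only an illustration under assumptions: the function name margin_adapted_dimension and the parameter gamma are hypothetical, and the rule used (the smallest k such that k times gamma squared dominates the sum of the eigenvalues beyond the k largest, computed from the uncentered second-moment matrix) is one plausible reading of the paper's definition; consult the paper for the exact statement.

```python
import numpy as np

def margin_adapted_dimension(X, gamma):
    """Estimate a margin-adapted dimension from a data sample X (n x d).

    Sketch only: uses the smallest k with k * gamma^2 >= sum of eigenvalues
    of the uncentered second-moment matrix beyond the k largest ones.
    The exact definition is given in the paper; treat this as an assumption.
    """
    # Uncentered second-moment matrix (the second-order statistics of the data).
    sigma = X.T @ X / X.shape[0]
    # Eigenvalues sorted in non-increasing order.
    eigvals = np.sort(np.linalg.eigvalsh(sigma))[::-1]
    # tail_sums[k] = sum of eigvals[k:], i.e. the spectrum beyond the k largest.
    tail_sums = np.cumsum(eigvals[::-1])[::-1]
    for k in range(len(eigvals) + 1):
        tail = tail_sums[k] if k < len(eigvals) else 0.0
        if k * gamma ** 2 >= tail:
            return k
    return len(eigvals)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Anisotropic data with a rapidly decaying spectrum: the margin-adapted
    # dimension should be much smaller than the ambient dimension d.
    d = 50
    scales = 1.0 / np.arange(1, d + 1)
    X = rng.normal(size=(1000, d)) * scales
    print(margin_adapted_dimension(X, gamma=0.1))
```

In this illustration, a rapidly decaying covariance spectrum yields a small margin-adapted dimension relative to the ambient dimension, which is the regime where the bounds above predict that large-margin learning needs comparatively few samples.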

Related research

11/23/2010 - Tight Sample Complexity of Large-Margin Learning
03/25/2019 - Sample Complexity Lower Bounds for Linear System Identification
02/04/2020 - Efficient, Noise-Tolerant, and Private Learning via Boosting
05/11/2015 - Sample complexity of learning Mahalanobis distance metrics
06/01/2023 - Provable Benefit of Mixup for Finding Optimal Decision Boundaries
06/20/2014 - Noise-adaptive Margin-based Active Learning and Lower Bounds under Tsybakov Noise Condition
02/20/2021 - Generalization bounds for graph convolutional neural networks via Rademacher complexity