Improving Generalization Bounds for VC Classes Using the Hypergeometric Tail Inversion

10/29/2021
by Jean-Samuel Leboeuf, et al.

We significantly improve the generalization bounds for VC classes by using two main ideas. First, we consider the hypergeometric tail inversion to obtain a very tight non-uniform, distribution-independent risk upper bound for VC classes. Second, we optimize the ghost sample trick to obtain a further non-negligible gain. These improvements are then used to derive a relative deviation bound, a multiclass margin bound, and a lower bound. Numerical comparisons show that the new bound is almost never vacuous and is tighter than other VC bounds for all reasonable data set sizes.
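To illustrate the core idea, the following is a minimal sketch of a hypergeometric tail inversion, not the paper's exact bound. Assuming a training sample of size `m`, a ghost sample of size `m_ghost`, `k` observed training errors, and confidence parameter `delta` (all names are illustrative), it finds the largest total number of errors `K` on the combined sample that remains consistent, at level `delta`, with observing only `k` errors on the training portion:

```python
from scipy.stats import hypergeom


def hypergeometric_tail_inversion(m, m_ghost, k, delta):
    """Illustrative sketch: invert the hypergeometric tail.

    Returns the largest K (errors on the combined sample of size
    m + m_ghost) such that drawing the m training examples uniformly
    without replacement yields at most k errors with probability >= delta.
    """
    N = m + m_ghost
    largest = k
    for K in range(k, N + 1):
        # P(at most k of the K total errors fall in the m training examples),
        # where the error count follows a hypergeometric(N, K, m) law.
        # This probability is non-increasing in K, so a linear scan suffices.
        if hypergeom.cdf(k, N, K, m) >= delta:
            largest = K
        else:
            break
    return largest
```

Dividing the inverted value by the combined sample size then gives a data-dependent error estimate of the kind the tail inversion is used for; a binary search would replace the linear scan in a production implementation.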


