Hypothesis Set Stability and Generalization

04/09/2019
by Dylan J. Foster, et al.

We present an extensive study of generalization for data-dependent hypothesis sets. We give a general learning guarantee for data-dependent hypothesis sets based on a notion of transductive Rademacher complexity. Our main results are two generalization bounds for data-dependent hypothesis sets expressed in terms of a notion of hypothesis set stability and a notion of Rademacher complexity for data-dependent hypothesis sets that we introduce. These bounds admit as special cases both standard Rademacher complexity bounds and algorithm-dependent uniform stability bounds. We also illustrate the use of these learning bounds in the analysis of several scenarios.
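
To make the central notion concrete, the following LaTeX sketch paraphrases the kind of stability definition the abstract refers to; it is an illustrative rendering with simplified quantifiers, not a verbatim statement from the paper. Here S and S' denote samples of size m that differ in exactly one point, H_S is the data-dependent hypothesis set built from S, and \ell is the loss.

    % Hedged paraphrase: a family of data-dependent hypothesis sets
    % (H_S)_S is \beta-stable if every hypothesis in H_S has a close
    % counterpart in H_{S'} whenever S and S' differ in one point.
    \forall\, h \in H_S,\ \exists\, h' \in H_{S'} \;:\;
        \sup_{z}\, \bigl|\, \ell(h, z) - \ell(h', z) \,\bigr| \;\le\; \beta .

Read this way, the two special cases mentioned above fall out directly: when \beta = 0 the hypothesis set is effectively independent of the sample and the bounds reduce to standard Rademacher complexity bounds, while when each H_S is the singleton {A(S)} for a learning algorithm A, \beta-stability of the family coincides with uniform stability of A, recovering algorithm-dependent uniform stability bounds.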

Related research

07/21/2020 · On the Rademacher Complexity of Linear Hypothesis Sets
Linear predictors form a rich class of hypotheses used in a variety of l...

06/21/2019 · Learning from weakly dependent data under Dobrushin's condition
Statistical learning theory has largely focused on learning and generali...

02/06/2023 · Generalization Bounds with Data-dependent Fractal Dimensions
Providing generalization guarantees for modern neural networks has been ...

06/21/2021 · Complexity-Free Generalization via Distributionally Robust Optimization
Established approaches to obtain generalization bounds in data-driven op...

03/21/2023 · Uniform Risk Bounds for Learning with Dependent Data Sequences
This paper extends standard results from learning theory with independen...

01/26/2019 · Stacking and stability
Stacking is a general approach for combining multiple models toward grea...

03/09/2023 · Data-dependent Generalization Bounds via Variable-Size Compressibility
In this paper, we establish novel data-dependent upper bounds on the gen...
