On the Rademacher Complexity of Linear Hypothesis Sets

07/21/2020
by Pranjal Awasthi, et al.

Linear predictors form a rich class of hypotheses used in a variety of learning algorithms. We present a tight analysis of the empirical Rademacher complexity of the family of linear hypothesis classes with weight vectors bounded in ℓ_p-norm, for any p ≥ 1. This yields sharp data-dependent generalization guarantees for learning with these hypothesis sets. We give both upper and lower bounds on the Rademacher complexity of these families and show that our bounds improve upon or match existing bounds, which are known only for 1 ≤ p ≤ 2.
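For concreteness, the quantity studied here has a closed form via Hölder duality: for H_p = {x ↦ ⟨w, x⟩ : ‖w‖_p ≤ Λ} and a sample S = (x_1, …, x_m), the empirical Rademacher complexity is R̂_S(H_p) = (Λ/m) E_σ ‖Σ_i σ_i x_i‖_q, where q is the conjugate exponent (1/p + 1/q = 1) and the σ_i are i.i.d. uniform ±1 signs. The sketch below is not taken from the paper; it is a minimal NumPy Monte Carlo estimate of that expectation, with an illustrative function name and sample sizes, shown next to the classical Frobenius-norm upper bound for p = 2 as a sanity check.

```python
import numpy as np

def empirical_rademacher_linear(X, Lam, p, n_trials=2000, seed=0):
    """Monte Carlo estimate of the empirical Rademacher complexity of
    H_p = {x -> <w, x> : ||w||_p <= Lam} on the sample X (m x d).

    By Hoelder duality, sup_{||w||_p <= Lam} <w, v> = Lam * ||v||_q
    with 1/p + 1/q = 1, so
        R_hat_S(H_p) = (Lam / m) * E_sigma ||sum_i sigma_i x_i||_q.
    """
    m, _ = X.shape
    q = np.inf if p == 1.0 else p / (p - 1.0)  # conjugate exponent of p
    rng = np.random.default_rng(seed)
    total = 0.0
    for _ in range(n_trials):
        sigma = rng.choice([-1.0, 1.0], size=m)  # Rademacher signs
        v = sigma @ X                            # sum_i sigma_i x_i
        total += np.linalg.norm(v, ord=q)        # dual-norm supremum
    return Lam / m * (total / n_trials)

# Illustrative data: m = 500 Gaussian points in d = 50 dimensions.
X = np.random.default_rng(1).standard_normal((500, 50))
est = empirical_rademacher_linear(X, Lam=1.0, p=2.0)
# Classical p = 2 upper bound: Lam * sqrt(sum_i ||x_i||_2^2) / m.
bound = 1.0 * np.sqrt((X ** 2).sum()) / 500
print(est, bound)  # the estimate should fall below the bound
```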


