
Local Rademacher Complexity Bounds based on Covering Numbers

by Yunwen Lei, et al.
Wuhan University
NetEase, Inc

This paper provides a general result for controlling local Rademacher complexities, which relates, in an elegant form, complexities defined with a constraint on the expected norm to the corresponding complexities defined with a constraint on the empirical norm. The result is convenient to apply in practice and yields refined local Rademacher complexity bounds for function classes satisfying general entropy conditions. We demonstrate the power of these complexity bounds by applying them to derive effective generalization error bounds.
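To make the central quantity concrete: for a sample x_1, ..., x_n and a function class F, the empirical Rademacher complexity is R_n(F) = E_sigma[ sup_{f in F} (1/n) sum_i sigma_i f(x_i) ], where the sigma_i are independent uniform ±1 signs. The sketch below (not from the paper; a generic Monte Carlo estimator for a finite class, with all names hypothetical) illustrates the definition that the paper's bounds control:

```python
import numpy as np

def empirical_rademacher(F_values, n_draws=2000, seed=0):
    """Monte Carlo estimate of the empirical Rademacher complexity
    R_n(F) = E_sigma[ sup_{f in F} (1/n) sum_i sigma_i f(x_i) ]
    for a FINITE class, given as a (|F|, n) array whose row j holds
    the values f_j(x_1), ..., f_j(x_n) on the fixed sample."""
    rng = np.random.default_rng(seed)
    m, n = F_values.shape
    # Draw independent Rademacher (uniform +/-1) sign vectors.
    sigma = rng.choice([-1.0, 1.0], size=(n_draws, n))
    # For each sign draw, the empirical correlation with every f in F,
    # then the supremum over the class.
    sups = (sigma @ F_values.T / n).max(axis=1)
    # Average over sign draws approximates the expectation over sigma.
    return sups.mean()
```

For an infinite class one would replace the finite maximum by a cover of the class at some scale, which is exactly where covering-number (entropy) conditions, the subject of the paper, enter.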




Generalization Bounds for Metric and Similarity Learning

Recently, metric learning and similarity learning have attracted a large...

Almost Global Problems in the LOCAL Model

The landscape of the distributed time complexity is nowadays well-unders...

Error Bounds for Piecewise Smooth and Switching Regression

The paper deals with regression problems, in which the nonsmooth target ...

Towards Empirical Process Theory for Vector-Valued Functions: Metric Entropy of Smooth Function Classes

This paper provides some first steps in developing empirical process the...

Sampling discretization error of integral norms for function classes with small smoothness

We consider infinitely dimensional classes of functions and instead of t...

Localized Complexities for Transductive Learning

We show two novel concentration inequalities for suprema of empirical pr...

Revisiting EXTRA for Smooth Distributed Optimization

EXTRA is a popular method for decentralized distributed optimizatio...