
Local Rademacher Complexity Bounds based on Covering Numbers

10/06/2015
by Yunwen Lei, et al.
Wuhan University
NetEase, Inc

This paper provides a general result for controlling local Rademacher complexities, which captures in an elegant form the relationship between complexities constrained by the expected norm and the corresponding ones constrained by the empirical norm. The result is convenient to apply in practice and yields refined local Rademacher complexity bounds for function classes satisfying general entropy conditions. We demonstrate the power of these complexity bounds by applying them to derive effective generalization error bounds.
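For context, the two quantities the abstract contrasts can be written out in standard empirical-process notation. The sketch below is a reference definition under conventional assumptions, not notation quoted from the paper itself: F is a function class, r > 0 a radius, the sigma_i are i.i.d. Rademacher signs, and the names R_n and \widehat{R}_n are hypothetical labels.

```latex
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
% Assumed standard definitions (not quoted from the paper): the local
% Rademacher complexity of a class F at radius r under an expected-norm
% constraint, and its empirical-norm counterpart.
\[
  R_n(F; r)
    = \mathbb{E} \sup_{\substack{f \in F \\ \mathbb{E} f^2 \le r}}
      \frac{1}{n} \sum_{i=1}^{n} \sigma_i f(X_i),
  \qquad
  \widehat{R}_n(F; r)
    = \mathbb{E}_{\sigma} \sup_{\substack{f \in F \\ \frac{1}{n} \sum_{i=1}^{n} f^2(X_i) \le r}}
      \frac{1}{n} \sum_{i=1}^{n} \sigma_i f(X_i)
\]
% Here the \sigma_i are i.i.d. signs with P(\sigma_i = \pm 1) = 1/2 and the
% X_i are i.i.d. samples. Results of the kind the abstract describes relate
% the expected-norm constraint (E f^2 <= r) to the empirical-norm constraint
% ((1/n) sum_i f^2(X_i) <= r).
\end{document}
```

Relating the expected-norm version to the empirical-norm version matters because only the latter is computable from a sample; entropy (covering-number) conditions on F are the usual route to such bounds.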

