Local Rademacher Complexity Bounds based on Covering Numbers

10/06/2015
by Yunwen Lei, et al.

This paper provides a general result on controlling local Rademacher complexities, which in an elegant form relates complexities constrained by the expected norm to the corresponding complexities constrained by the empirical norm. This result is convenient to apply in practice and yields refined local Rademacher complexity bounds for function classes satisfying general entropy conditions. We demonstrate the power of these complexity bounds by applying them to derive effective generalization error bounds.
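The quantity the abstract is about can be made concrete with a small numerical sketch. The code below is not from the paper; it is a minimal Monte Carlo estimate of the empirical Rademacher complexity of a finite function class, with an optional `radius` argument restricting the supremum to functions of small empirical squared norm, which is the kind of localization (constraint on the empirical norm) the abstract refers to. The function name and interface are illustrative assumptions.

```python
import numpy as np

def empirical_rademacher(fvals, radius=None, n_draws=2000, seed=0):
    """Monte Carlo estimate of the (localized) empirical Rademacher
    complexity of a finite function class.

    fvals:  (m, n) array; row j holds f_j evaluated at the n sample points.
    radius: if given, restrict the supremum to functions whose empirical
            squared norm (1/n) * sum_i f(x_i)^2 is at most `radius`
            (a localized, i.e. empirical-norm-constrained, class).
    """
    rng = np.random.default_rng(seed)
    fvals = np.asarray(fvals, dtype=float)
    if radius is not None:
        keep = (fvals ** 2).mean(axis=1) <= radius
        fvals = fvals[keep]
        if fvals.size == 0:
            return 0.0
    n = fvals.shape[1]
    # n_draws independent Rademacher (+/-1) sign vectors of length n
    sigma = rng.choice([-1.0, 1.0], size=(n_draws, n))
    # per draw: sup over the class of (1/n) * sum_i sigma_i f(x_i)
    sups = (fvals @ sigma.T).max(axis=0) / n
    return float(sups.mean())

# Illustrative usage: a class of one-dimensional threshold functions.
rng = np.random.default_rng(1)
x = np.sort(rng.uniform(size=50))
thresholds = np.linspace(0.0, 1.0, 20)
fvals = (x[None, :] >= thresholds[:, None]).astype(float)
full_complexity = empirical_rademacher(fvals)
local_complexity = empirical_rademacher(fvals, radius=0.1)
```

Localizing shrinks the class, so the localized estimate never exceeds the global one; local Rademacher complexity analysis exploits exactly this shrinkage to get sharper generalization bounds than the global complexity allows.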


Related research

07/23/2012: Generalization Bounds for Metric and Similarity Learning
Recently, metric learning and similarity learning have attracted a large...

05/12/2018: Almost Global Problems in the LOCAL Model
The landscape of the distributed time complexity is nowadays well-unders...

07/25/2017: Error Bounds for Piecewise Smooth and Switching Regression
The paper deals with regression problems, in which the nonsmooth target ...

02/09/2022: Towards Empirical Process Theory for Vector-Valued Functions: Metric Entropy of Smooth Function Classes
This paper provides some first steps in developing empirical process the...

07/03/2023: A maximal inequality for local empirical processes under weak dependence
We introduce a maximal inequality for a local empirical process under st...

11/26/2014: Localized Complexities for Transductive Learning
We show two novel concentration inequalities for suprema of empirical pr...

05/29/2019: Improved Generalisation Bounds for Deep Learning Through L^∞ Covering Numbers
Using proof techniques involving L^∞ covering numbers, we show generalis...
