A vector-contraction inequality for Rademacher complexities

05/01/2016
by Andreas Maurer

The contraction inequality for Rademacher averages is extended to Lipschitz functions with vector-valued domains, and it is also shown that in the bounding expression the Rademacher variables can be replaced by arbitrary i.i.d. symmetric and sub-Gaussian variables. Example applications are given for multi-category learning, K-means clustering and learning-to-learn.
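For concreteness, here is the core inequality in the form it is usually stated (a sketch; the notation \mathcal{F}, h_i, \varepsilon_i, \varepsilon_{ik} and f_k is supplied here for illustration, not taken from the abstract). If \mathcal{F} is a class of functions f : \mathcal{X} \to \ell_2 and each h_i : \ell_2 \to \mathbb{R} is L-Lipschitz, then

\[
\mathbb{E}\,\sup_{f \in \mathcal{F}} \sum_{i=1}^{n} \varepsilon_i \, h_i(f(x_i))
\;\le\; \sqrt{2}\, L \;\mathbb{E}\,\sup_{f \in \mathcal{F}} \sum_{i=1}^{n} \sum_{k} \varepsilon_{ik} \, f_k(x_i),
\]

where the \varepsilon_i and \varepsilon_{ik} are independent Rademacher variables and f_k(x_i) denotes the k-th coordinate of f(x_i). The second result mentioned in the abstract says that the \varepsilon_{ik} on the right-hand side may be replaced by any i.i.d. symmetric sub-Gaussian variables, at the cost of a constant depending on their distribution.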

Related research

A vector-contraction inequality for Rademacher complexities using p-stable variables (12/20/2019)
Andreas Maurer in the paper "A vector-contraction inequality for Rademac...

ℓ_∞ Vector Contraction for Rademacher Complexity (11/15/2019)
We show that the Rademacher complexity of any R^K-valued function class ...

Optimistic bounds for multi-output prediction (02/22/2020)
We investigate the challenge of multi-output learning, where the goal is...

Hanson-Wright inequality in Hilbert spaces with application to K-means clustering for non-Euclidean data (10/26/2018)
We derive a dimension-free Hanson-Wright inequality for quadratic form...

Bounds for Vector-Valued Function Estimation (06/05/2016)
We present a framework to derive risk bounds for vector-valued learning ...

Skewed probit regression – Identifiability, contraction and reformulation (09/20/2020)
Skewed probit regression is but one example of a statistical model that ...

Computation and verification of contraction metrics for exponentially stable equilibria (09/20/2019)
The determination of exponentially stable equilibria and their basin of ...
