Variations and extensions of the Gaussian concentration inequality, Part II

03/23/2022
by Daniel J. Fresen, et al.

Pisier's version of the Gaussian concentration inequality is transformed and used to prove deviation inequalities for locally Lipschitz functions with respect to heavy-tailed product measures on Euclidean space. The approach is, in our opinion, more direct than much of the modern theory of concentration of measure (e.g. Poincaré and log-Sobolev inequalities, moment estimates, etc.).
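For reference (this is a standard statement of the inequality, not necessarily the paper's exact formulation; hypotheses and constants may differ there): if $X$ and $Y$ are independent standard Gaussian vectors in $\mathbb{R}^n$, $f : \mathbb{R}^n \to \mathbb{R}$ is smooth and Lipschitz, and $\varphi : \mathbb{R} \to \mathbb{R}$ is convex, then Pisier's inequality reads

\[
\mathbb{E}\,\varphi\big(f(X) - f(Y)\big) \;\le\; \mathbb{E}\,\varphi\!\Big(\tfrac{\pi}{2}\,\langle \nabla f(X),\, Y\rangle\Big).
\]

Taking $\varphi(x) = e^{\lambda x}$, conditioning on $X$, and optimizing the resulting Chernoff bound yields a sub-Gaussian deviation inequality of the form $\mathbb{P}\big(|f(X) - \mathbb{E} f(X)| \ge t\big) \le 2\exp\!\big(-2t^2/(\pi^2 L^2)\big)$ for $L$-Lipschitz $f$; per the abstract, the paper transfers this mechanism to locally Lipschitz functions and heavy-tailed product measures.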


research
02/11/2021

Some Hoeffding- and Bernstein-type Concentration Inequalities

We prove concentration inequalities for functions of independent random ...
research
10/15/2022

On Catoni's M-Estimation

Catoni proposed a robust M-estimator and gave the deviation inequality f...
research
05/09/2018

Concentration inequalities for randomly permuted sums

Initially motivated by the study of the non-asymptotic properties of non...
research
09/07/2019

Concentration of kernel matrices with application to kernel spectral clustering

We study the concentration of random kernel matrices around their mean. ...
research
04/04/2023

q-Partitioning Valuations: Exploring the Space Between Subadditive and Fractionally Subadditive Valuations

For a set M of m elements, we define a decreasing chain of classes of no...
research
07/11/2019

Computational Concentration of Measure: Optimal Bounds, Reductions, and More

Product measures of dimension n are known to be concentrated in Hamming ...
research
09/18/2020

Deviation bound for non-causal machine learning

Concentration inequalities are widely used for analysing machine learning...
