On change of measure inequalities for f-divergences

02/11/2022
by Antoine Picard-Weibel et al.

We propose new change of measure inequalities based on f-divergences (of which the Kullback-Leibler divergence is a particular case). Our strategy relies on combining the Legendre transform of f-divergences with the Young-Fenchel inequality. By exploiting these new change of measure inequalities, we derive new PAC-Bayesian generalisation bounds whose complexity term involves f-divergences, and which hold in largely uncharted settings (such as heavy-tailed losses). We instantiate our results for the most popular f-divergences.
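The strategy the abstract describes admits a short sketch; the notation below is ours and not necessarily the paper's. For a convex function f with Fenchel conjugate f^*(y) = \sup_x (xy - f(x)), the Young-Fenchel inequality xy \le f(x) + f^*(y), applied pointwise with x = dQ/dP (assuming Q \ll P), gives, for any measurable \varphi,

\[
\mathbb{E}_{Q}[\varphi]
= \mathbb{E}_{P}\!\left[\frac{dQ}{dP}\,\varphi\right]
\le \mathbb{E}_{P}\!\left[f\!\left(\frac{dQ}{dP}\right)\right] + \mathbb{E}_{P}\!\left[f^{*}(\varphi)\right]
= D_{f}(Q \,\|\, P) + \mathbb{E}_{P}\!\left[f^{*}(\varphi)\right],
\]

where D_f(Q \| P) = \mathbb{E}_P[f(dQ/dP)] is the f-divergence. For instance, with f(x) = x \log x one has f^*(y) = e^{y-1}, and optimising the bound over constant shifts of \varphi recovers the Donsker-Varadhan change of measure \mathbb{E}_Q[\varphi] \le \mathrm{KL}(Q \,\|\, P) + \log \mathbb{E}_P[e^{\varphi}].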

