Tight bounds for augmented KL divergence in terms of augmented total variation distance

03/01/2022
by Michele Caprio, et al.

We provide optimal variational upper and lower bounds for the augmented Kullback-Leibler divergence in terms of the augmented total variation distance between two probability measures defined on Euclidean spaces of different dimensions.
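
For context, the classical (non-augmented) setting already admits well-known bounds of this type between KL and TV for measures on a common space: Pinsker's inequality gives the lower bound KL(P||Q) >= 2 TV(P,Q)^2, and the Bretagnolle-Huber inequality gives the upper bound TV(P,Q) <= sqrt(1 - exp(-KL(P||Q))). The minimal sketch below only checks these classical inequalities numerically for two discrete distributions; it does not reproduce the paper's augmented divergences or its optimal bounds, and the distributions are illustrative.

```python
import numpy as np

# Two discrete distributions on a common finite space (classical setting;
# the paper's augmented KL/TV for spaces of different dimensions is not
# reproduced here).
p = np.array([0.5, 0.3, 0.2])
q = np.array([0.4, 0.4, 0.2])

kl = np.sum(p * np.log(p / q))    # Kullback-Leibler divergence KL(p || q)
tv = 0.5 * np.sum(np.abs(p - q))  # total variation distance TV(p, q)

# Pinsker's inequality (lower bound on KL in terms of TV): KL >= 2 * TV^2
assert kl >= 2 * tv**2

# Bretagnolle-Huber inequality (upper bound on TV in terms of KL)
assert tv <= np.sqrt(1 - np.exp(-kl))

print(f"KL = {kl:.4f}, TV = {tv:.4f}")
```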

Related research

12/12/2022 · Lower Bounds for the Total Variation Distance Given Means and Variances of Distributions
For arbitrary two probability measures on real d-space with given means ...

02/24/2022 · Tighter Expected Generalization Error Bounds via Convexity of Information Measures
Generalization error bounds are essential to understanding machine learn...

02/15/2022 · A short note on an inequality between KL and TV
The goal of this short note is to discuss the relation between Kullback–...

06/10/2020 · Optimal Bounds between f-Divergences and Integral Probability Metrics
The families of f-divergences (e.g. the Kullback-Leibler divergence) and...

04/16/2021 · On the Robustness to Misspecification of α-Posteriors and Their Variational Approximations
α-posteriors and their variational approximations distort standard poste...

04/17/2019 · Samplers and extractors for unbounded functions
Blasiok (SODA'18) recently introduced the notion of a subgaussian sample...

10/24/2022 · Contraction of Locally Differentially Private Mechanisms
We investigate the contraction properties of locally differentially priv...