Optimal Bounds between f-Divergences and Integral Probability Metrics

06/10/2020
by Rohit Agrawal, et al.

The families of f-divergences (e.g. the Kullback-Leibler divergence) and Integral Probability Metrics (e.g. total variation distance or maximum mean discrepancies) are commonly used in optimization and estimation. In this work, we systematically study the relationship between these two families from the perspective of convex duality. Starting from a tight variational representation of the f-divergence, we derive a generalization of the moment generating function, which we show exactly characterizes the best lower bound of the f-divergence as a function of a given IPM. Using this characterization, we obtain new bounds on IPMs defined by classes of unbounded functions, while also recovering in a unified manner well-known results for bounded and subgaussian functions (e.g. Pinsker's inequality and Hoeffding's lemma). The variational representation also allows us to prove new results on the topological properties of the divergence which may be of independent interest.
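As a concrete instance of the bounds discussed above, Pinsker's inequality relates the total variation distance (an IPM over functions bounded by 1/2) to the Kullback-Leibler divergence: TV(P, Q) ≤ √(KL(P‖Q)/2). The following sketch (with illustrative distributions chosen here, not taken from the paper) checks this numerically for two discrete distributions:

```python
import math

# Two discrete distributions on the same finite support (illustrative values).
p = [0.5, 0.3, 0.2]
q = [0.2, 0.4, 0.4]

# KL divergence D(P || Q) = sum_i p_i * log(p_i / q_i), in nats.
kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

# Total variation distance TV(P, Q) = (1/2) * sum_i |p_i - q_i|.
tv = 0.5 * sum(abs(pi - qi) for pi, qi in zip(p, q))

# Pinsker's inequality: TV(P, Q) <= sqrt(D(P || Q) / 2).
print(f"KL = {kl:.4f}, TV = {tv:.4f}, sqrt(KL/2) = {math.sqrt(kl / 2):.4f}")
assert tv <= math.sqrt(kl / 2)
```

The paper's contribution is to characterize exactly when such inequalities are tight, for general f-divergences and general (including unbounded) function classes, via the convex-duality machinery described in the abstract.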


