Function-space regularized Rényi divergences

10/10/2022
by Jeremiah Birrell, et al.

We propose a new family of regularized Rényi divergences parametrized not only by the order α but also by a variational function space. These new objects are defined by taking the infimal convolution of the standard Rényi divergence with the integral probability metric (IPM) associated with the chosen function space. We derive a novel dual variational representation that can be used to construct numerically tractable divergence estimators. This representation avoids risk-sensitive terms and therefore exhibits lower variance, making it well-behaved when α>1; this addresses a notable weakness of prior approaches. We prove several properties of these new divergences, showing that they interpolate between the classical Rényi divergences and IPMs. We also study the α→∞ limit, which leads to a regularized worst-case-regret and a new variational representation in the classical case. Moreover, we show that the proposed regularized Rényi divergences inherit features from IPMs such as the ability to compare distributions that are not absolutely continuous, e.g., empirical measures and distributions with low-dimensional support. We present numerical results on both synthetic and real datasets, showing the utility of these new divergences in both estimation and GAN training applications; in particular, we demonstrate significantly reduced variance and improved training performance.
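For orientation, here is a minimal sketch of the objects the abstract refers to, in standard notation. The classical Rényi divergence of order α (for P ≪ Q and α ∈ (0,1) ∪ (1,∞)) and the IPM associated with a function space Γ are standard; the abstract does not spell out the regularized divergence, so the exact placement of the intermediate measure η in the infimal convolution below is an assumption, not the paper's stated definition.

\[
R_\alpha(P \,\|\, Q) \;=\; \frac{1}{\alpha - 1} \log \int \Big( \frac{dP}{dQ} \Big)^{\alpha} \, dQ ,
\qquad
W^{\Gamma}(P, Q) \;=\; \sup_{g \in \Gamma} \big\{ \mathbb{E}_P[g] - \mathbb{E}_Q[g] \big\} .
\]

\[
R_\alpha^{\Gamma}(P \,\|\, Q) \;=\; \inf_{\eta} \big\{ R_\alpha(\eta \,\|\, Q) + W^{\Gamma}(P, \eta) \big\} .
\]

Taking Γ larger recovers behavior closer to the classical Rényi divergence, while the IPM term is what allows comparison of mutually singular distributions such as empirical measures.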
