Tighter Variational Representations of f-Divergences via Restriction to Probability Measures

06/18/2012
by Avraham Ruderman, et al.

We show that the variational representations of f-divergences currently used in the literature can be tightened by restricting the optimization to probability measures. This tightening has implications for a number of recently proposed methods that rely on the standard representation. As an example application, we use the tighter representation to derive a general f-divergence estimator based on two i.i.d. samples, and we derive the dual program for this estimator, which performs well empirically. We also point out a connection between our estimator and the maximum mean discrepancy (MMD).
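For context, the representation referred to in the abstract is the standard convex-conjugate bound (due to Nguyen, Wainwright, and Jordan):

    D_f(P \| Q) = \sup_g \; \mathbb{E}_{x \sim P}[g(x)] - \mathbb{E}_{x \sim Q}[f^*(g(x))],

where f^*(t) = \sup_u \{ut - f(u)\} is the convex conjugate of f and the supremum runs over measurable functions g taking values in the domain of f^*. For the Kullback-Leibler divergence, f(u) = u log u gives f^*(t) = e^{t-1}, so the bound reads KL(P \| Q) >= E_P[g] - E_Q[e^{g-1}]. The classical instance of the tightening discussed here is the Kullback-Leibler case, where restricting the dual variable to probability measures recovers the Donsker-Varadhan form

    \mathrm{KL}(P \| Q) = \sup_g \; \mathbb{E}_P[g] - \log \mathbb{E}_Q[e^{g}],

which dominates the conjugate bound for every fixed g, since log z <= z/e with equality only at z = e. The paper generalizes this phenomenon beyond KL; only the KL instance is stated here because it is the well-known case.

The following is a minimal sketch of the sample-based estimation idea in that KL instance: it maximizes the Donsker-Varadhan lower bound over a small parametric class, using two i.i.d. samples. It is not the authors' dual program; the function class, sample sizes, step size, and iteration count are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)
xp = rng.normal(loc=1.0, scale=1.0, size=2000)  # i.i.d. sample from P = N(1, 1)
xq = rng.normal(loc=0.0, scale=1.0, size=2000)  # i.i.d. sample from Q = N(0, 1)

def features(x):
    # (n, 3) design matrix for the illustrative class g(x) = w0*x + w1*x^2 + w2
    return np.stack([x, x ** 2, np.ones_like(x)], axis=1)

Fp, Fq = features(xp), features(xq)

def dv_bound(w):
    # Donsker-Varadhan objective E_P[g] - log E_Q[e^g], log-mean-exp computed stably
    gq = Fq @ w
    m = gq.max()
    return (Fp @ w).mean() - (m + np.log(np.mean(np.exp(gq - m))))

w = np.zeros(3)
for _ in range(2000):
    gq = Fq @ w
    soft = np.exp(gq - gq.max())
    soft /= soft.sum()                      # softmax weights under Q
    grad = Fp.mean(axis=0) - soft @ Fq      # gradient of the concave objective
    w += 0.05 * grad                        # plain gradient ascent

print(f"DV estimate: {dv_bound(w):.3f}  (true KL(N(1,1) || N(0,1)) = 0.5)")

The optimal g here is log(dP/dQ) up to an additive constant, which for these two Gaussians equals x - 0.5 and lies inside the chosen linear-in-parameters class, so the estimate should approach the true value 0.5.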


