On Relations Between Tight Bounds for Symmetric f-Divergences and Binary Divergences

10/18/2022
by   Tomohiro Nishiyama, et al.

Minimizing divergence measures under a constraint is an important problem. We derive a sufficient condition under which binary divergence measures provide lower bounds for symmetric divergence measures, given either the triangular discrimination or the means and variances of two probability measures. Under this sufficient condition, the former bounds are always tight, and the latter bounds are tight when the two probability measures have the same variance. An application of these results to nonequilibrium physics is provided.
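The flavor of such bounds can be illustrated numerically. A well-known consequence of the data-processing inequality is that coarse-graining two distributions down to a binary event can only decrease any f-divergence between them, so the binary divergence is a lower bound for the full one. The sketch below (an illustration of this general fact, not of the paper's specific sufficient condition; the distributions and event are made up) checks this for the triangular discrimination Δ(P, Q) = Σᵢ (pᵢ − qᵢ)² / (pᵢ + qᵢ), a symmetric f-divergence:

```python
import numpy as np

def triangular_discrimination(p, q):
    # Delta(P, Q) = sum_i (p_i - q_i)^2 / (p_i + q_i), a symmetric f-divergence.
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = (p + q) > 0  # skip outcomes with zero mass under both measures
    return float(np.sum((p[mask] - q[mask]) ** 2 / (p[mask] + q[mask])))

def binarize(p, event):
    # Coarse-grain a distribution to the binary pair (P(A), 1 - P(A)).
    a = float(np.sum(np.asarray(p, float)[list(event)]))
    return np.array([a, 1.0 - a])

# Two toy distributions on a 4-point alphabet (illustrative values only).
P = np.array([0.10, 0.20, 0.30, 0.40])
Q = np.array([0.25, 0.25, 0.25, 0.25])

event = {0, 1}  # any event A induces a binary coarse-graining
full = triangular_discrimination(P, Q)
binary = triangular_discrimination(binarize(P, event), binarize(Q, event))

print(f"Delta(P, Q)        = {full:.4f}")
print(f"binary lower bound = {binary:.4f}")
assert binary <= full  # data-processing inequality for f-divergences
```

The paper's contribution, as the abstract states, goes further than this generic inequality: it identifies when the binary lower bound is actually tight under the stated constraints.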


Related research

05/27/2021 · Tight Lower Bounds for α-Divergences Under Moment Constraints and Relations Between Different α
The α-divergences include the well-known Kullback-Leibler divergence, He...

03/20/2019 · Note on bounds for symmetric divergence measures
I. Sason obtained the tight bounds for symmetric divergence measures are...

10/26/2020 · A Tight Lower Bound for the Hellinger Distance with Given Means and Variances
The binary divergences that are divergences between probability measures...

02/07/2020 · Bounds on the Information Divergence for Hypergeometric Distributions
The hypergeometric distributions have many important applications, but t...

08/10/2020 · Probability Link Models with Symmetric Information Divergence
This paper introduces link functions for transforming one probability di...

09/02/2020 · Properties of f-divergences and f-GAN training
In this technical report we describe some properties of f-divergences an...

09/17/2018 · On Minimal Copulas under the Concordance Order
In the present paper, we study extreme negative dependence focussing on ...
