A Tight Lower Bound for the Hellinger Distance with Given Means and Variances

10/26/2020
by Tomohiro Nishiyama, et al.

Binary divergences, i.e., divergences between probability measures defined on the same two-point set, have an interesting property. For the chi-squared divergence and the relative entropy, it is known that their binary divergences attain the lower bounds on the divergence under given means and variances. In this note, we show that the binary divergence of the squared Hellinger distance has the same property, and we propose an open problem: what conditions must an f-divergence satisfy for this property to hold?
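The quantities in the abstract can be made concrete with a short sketch. Below is a minimal illustration, assuming discrete distributions given as probability vectors on a common finite support: the squared Hellinger distance H²(P, Q) = 1 − Σᵢ √(pᵢqᵢ), and the mean and variance of each measure. The two-point support and the Bernoulli parameters are arbitrary choices for illustration; the note's specific binary construction matching prescribed means and variances is not reproduced here.

```python
import math

def hellinger_sq(p, q):
    """Squared Hellinger distance H^2(P, Q) = 1 - sum_i sqrt(p_i * q_i)
    for two probability vectors on the same finite support."""
    assert abs(sum(p) - 1.0) < 1e-12 and abs(sum(q) - 1.0) < 1e-12
    return 1.0 - sum(math.sqrt(pi * qi) for pi, qi in zip(p, q))

def mean_var(support, probs):
    """Mean and variance of a discrete measure (support points, probabilities)."""
    m = sum(x * w for x, w in zip(support, probs))
    v = sum((x - m) ** 2 * w for x, w in zip(support, probs))
    return m, v

# Two measures on the same 2-point set {0, 1}: a "binary" pair.
support = [0.0, 1.0]
p = [0.7, 0.3]   # Bernoulli(0.3): mean 0.3, variance 0.21
q = [0.4, 0.6]   # Bernoulli(0.6): mean 0.6, variance 0.24

print(mean_var(support, p))
print(mean_var(support, q))
print(hellinger_sq(p, q))
```

The note's result says that, among all pairs of measures with these means and variances, a suitably constructed binary pair minimizes the squared Hellinger distance, mirroring the known statements for the chi-squared divergence and the relative entropy.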


Related research

- 05/27/2021: Tight Lower Bounds for α-Divergences Under Moment Constraints and Relations Between Different α. The α-divergences include the well-known Kullback-Leibler divergence, He...
- 10/18/2022: On Relations Between Tight Bounds for Symmetric f-Divergences and Binary Divergences. Minimizing divergence measures under a constraint is an important proble...
- 04/09/2020: Lecture Note on LCSSX's Lower Bounds for Non-Adaptive Distribution-free Property Testing. In this lecture note we give Liu-Chen-Servedio-Sheng-Xie's (LCSSX) lower...
- 09/30/2018: On Exact and ∞-Rényi Common Informations. Recently, two extensions of Wyner's common information, exact and Rényi...
- 03/20/2019: Note on bounds for symmetric divergence measures. I. Sason obtained the tight bounds for symmetric divergence measures are...
- 08/05/2018: Prediction in Riemannian metrics derived from divergence functions. Divergence functions are interesting discrepancy measures. Even though t...
- 09/05/2018: Bregman divergences based on optimal design criteria and simplicial measures of dispersion. In previous work the authors defined the k-th order simplicial distance ...
