On a generalization of the Jensen-Shannon divergence and the JS-symmetrization of distances relying on abstract means

04/08/2019
by Frank Nielsen, et al.

The Jensen-Shannon divergence is a renowned bounded symmetrization of the unbounded Kullback-Leibler divergence which measures the total Kullback-Leibler divergence to the average mixture distribution. However, the Jensen-Shannon divergence between Gaussian distributions is not available in closed form. To bypass this problem, we present a generalization of the Jensen-Shannon (JS) divergence using abstract means which yields closed-form expressions when the mean is chosen according to the parametric family of distributions. More generally, we define the JS-symmetrization of any statistical distance using mixtures derived from abstract means. In particular, we first show that the geometric mean is well-suited for exponential families, and report closed-form formulas for the geometric Jensen-Shannon divergence and the geometric JS-symmetrization of the reverse Kullback-Leibler divergence. As a second illustrative example, we show that the harmonic mean is well-suited for scale Cauchy distributions, and report a closed-form formula for the harmonic Jensen-Shannon divergence between scale Cauchy distributions. Applications to clustering with respect to these novel Jensen-Shannon divergences are touched upon.
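As a rough illustration of the construction, the sketch below computes a geometric Jensen-Shannon divergence between two univariate Gaussians: the normalized weighted geometric mean of two exponential-family densities stays in the family with linearly interpolated natural parameters, so the divergence reduces to two ordinary Kullback-Leibler terms, each available in closed form. This is a minimal sketch, not the paper's code; the function names and the weighting convention (weight alpha on the second distribution) are illustrative assumptions.

```python
import math

def kl_gaussian(mu_a, var_a, mu_b, var_b):
    """Closed-form KL(N(mu_a, var_a) || N(mu_b, var_b)) for univariate Gaussians."""
    return 0.5 * (math.log(var_b / var_a) + (var_a + (mu_a - mu_b) ** 2) / var_b - 1.0)

def geometric_mixture_gaussian(mu1, var1, mu2, var2, alpha):
    """Normalized weighted geometric mean p1^(1-alpha) * p2^alpha of two Gaussians.

    In natural coordinates theta = (mu/var, -1/(2 var)) the geometric mixture has
    natural parameter (1-alpha)*theta1 + alpha*theta2, hence it is again Gaussian.
    """
    precision = (1.0 - alpha) / var1 + alpha / var2   # interpolated precision 1/var
    var_g = 1.0 / precision
    mu_g = var_g * ((1.0 - alpha) * mu1 / var1 + alpha * mu2 / var2)
    return mu_g, var_g

def geometric_jsd_gaussian(mu1, var1, mu2, var2, alpha=0.5):
    """Geometric JS divergence: weighted KL of each Gaussian to their geometric
    mixture (assumed weighting convention for this sketch)."""
    mu_g, var_g = geometric_mixture_gaussian(mu1, var1, mu2, var2, alpha)
    return ((1.0 - alpha) * kl_gaussian(mu1, var1, mu_g, var_g)
            + alpha * kl_gaussian(mu2, var2, mu_g, var_g))

if __name__ == "__main__":
    # Example: two univariate Gaussians with different means and variances.
    print(geometric_jsd_gaussian(0.0, 1.0, 1.0, 2.0))
```

With alpha = 0.5 the quantity is symmetric in its two arguments, mirroring how the ordinary Jensen-Shannon divergence symmetrizes the Kullback-Leibler divergence with an arithmetic mixture.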

research 05/27/2019
A closed-form formula for the Kullback-Leibler divergence between Cauchy distributions
We report a closed-form expression for the Kullback-Leibler divergence b...

research 02/19/2021
On a Variational Definition for the Jensen-Shannon Symmetrization of Distances based on the Information Radius
We generalize the Jensen-Shannon divergence by considering a variational...

research 01/14/2017
On Hölder projective divergences
We describe a framework to build distances by measuring the tightness of...

research 07/08/2022
Revisiting Chernoff Information with Likelihood Ratio Exponential Families
The Chernoff information between two probability measures is a statistic...

research 01/27/2020
A generalization of the α-divergences based on comparable and distinct weighted means
We generalize the renown family of α-divergences in information geometry...

research 09/21/2010
A family of statistical symmetric divergences based on Jensen's inequality
We introduce a novel parametric family of symmetric information-theoreti...

research 03/30/2020
A note on Onicescu's informational energy and correlation coefficient in exponential families
The informational energy of Onicescu is a positive quantity that measure...
