A family of statistical symmetric divergences based on Jensen's inequality

09/21/2010
by Frank Nielsen, et al.

We introduce a novel parametric family of symmetric information-theoretic distances based on Jensen's inequality for a convex functional generator. In particular, this family unifies the celebrated Jeffreys divergence with the Jensen-Shannon divergence when the Shannon entropy generator is chosen. We then design a generic algorithm to compute the unique centroid defined as the minimizer of the average divergence. This yields a smooth family of centroids linking the Jeffreys centroid to the Jensen-Shannon centroid. Finally, we report on our experimental results.
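
To make the setting concrete, below is a minimal numerical sketch in Python (using NumPy and SciPy, which are illustrative choices, not tools mentioned in the paper). It implements only the two endpoints named in the abstract, the Jeffreys and Jensen-Shannon divergences on discrete distributions, and computes a centroid as the minimizer of the average divergence via generic optimization over the probability simplex. The paper's parametric family and its dedicated centroid algorithm are not reproduced here; the function names and the softmax parametrization are assumptions of this sketch.

# A minimal sketch, not the paper's algorithm: Jeffreys and Jensen-Shannon
# divergences on discrete distributions, plus a centroid computed as the
# minimizer of the average divergence by generic numerical optimization.
import numpy as np
from scipy.optimize import minimize

def kl(p, q, eps=1e-12):
    """Kullback-Leibler divergence KL(p || q) for discrete distributions."""
    p = np.clip(p, eps, None)
    q = np.clip(q, eps, None)
    return float(np.sum(p * np.log(p / q)))

def jeffreys(p, q):
    """Jeffreys divergence: symmetrized KL, i.e. KL(p||q) + KL(q||p)."""
    return kl(p, q) + kl(q, p)

def jensen_shannon(p, q):
    """Jensen-Shannon divergence: average KL to the mid-point mixture."""
    m = 0.5 * (np.asarray(p) + np.asarray(q))
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def centroid(points, divergence):
    """Centroid of a set of distributions: argmin_c mean_i divergence(p_i, c),
    found here with a softmax parametrization to stay on the simplex."""
    points = np.asarray(points)

    def objective(theta):
        c = np.exp(theta - theta.max())
        c /= c.sum()                       # project parameters onto the simplex
        return np.mean([divergence(p, c) for p in points])

    theta0 = np.log(points.mean(axis=0) + 1e-12)   # start at the arithmetic mean
    res = minimize(objective, theta0, method="Nelder-Mead")
    c = np.exp(res.x - res.x.max())
    return c / c.sum()

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    pts = rng.dirichlet(np.ones(5), size=4)        # four random histograms
    print("Jensen-Shannon centroid:", np.round(centroid(pts, jensen_shannon), 4))
    print("Jeffreys centroid:      ", np.round(centroid(pts, jeffreys), 4))

Swapping jensen_shannon for jeffreys in the centroid call moves between the two centroids that, per the abstract, the proposed family links through a smooth parametric path.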

Related research

12/02/2019
On a generalization of the Jensen-Shannon divergence
The Jensen-Shannon divergence is a renowned bounded symmetrization of the ...

04/08/2019
On a generalization of the Jensen-Shannon divergence and the JS-symmetrization of distances relying on abstract means
The Jensen-Shannon divergence is a renowned bounded symmetrization of the ...

12/05/2018
A Short Note on the Jensen-Shannon Divergence between Simple Mixture Distributions
This short note presents results about the symmetric Jensen-Shannon dive...

02/19/2021
On a Variational Definition for the Jensen-Shannon Symmetrization of Distances based on the Information Radius
We generalize the Jensen-Shannon divergence by considering a variational...

12/01/2019
Adaptive Divergence for Rapid Adversarial Optimization
Adversarial Optimization (AO) provides a reliable, practical way to matc...

11/06/2019
Metrics Induced by Quantum Jensen-Shannon-Rényi and Related Divergences
We study symmetric divergences on Hermitian positive definite matrices g...

02/20/2023
Metrization of powers of the Jensen-Shannon divergence
Metrization of statistical divergences is useful in both theoretical and...
