On a generalization of the Jensen-Shannon divergence and the JS-symmetrization of distances relying on abstract means

04/08/2019
by Frank Nielsen, et al.

The Jensen-Shannon divergence is a renowned bounded symmetrization of the unbounded Kullback-Leibler divergence which measures the total Kullback-Leibler divergence to the average mixture distribution. However, the Jensen-Shannon divergence between Gaussian distributions is not available in closed form. To bypass this problem, we present a generalization of the Jensen-Shannon (JS) divergence using abstract means, which yields closed-form expressions when the mean is chosen according to the parametric family of distributions. More generally, we define the JS-symmetrizations of any statistical distance using mixtures derived from abstract means. In particular, we first show that the geometric mean is well-suited for exponential families, and report closed-form formulas for the geometric Jensen-Shannon divergence and the geometric JS-symmetrization of the reverse Kullback-Leibler divergence. As a second illustrating example, we show that the harmonic mean is well-suited for the scale Cauchy distributions, and report a closed-form formula for the harmonic Jensen-Shannon divergence between scale Cauchy distributions. Applications to clustering with respect to these novel Jensen-Shannon divergences are touched upon.
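To make the construction concrete, below is a minimal Python sketch of the geometric-mean variant for univariate Gaussians, following the idea stated in the abstract: replace the arithmetic mixture with the normalized weighted geometric mean of the two densities, which for Gaussians (an exponential family) is again a Gaussian, so each Kullback-Leibler term has a closed form. The function names (`geometric_js_gauss`, `classical_js_gauss_mc`) and the Monte Carlo comparison are illustrative assumptions, not code from the paper.

```python
import numpy as np

def kl_gauss(mu0, s0, mu1, s1):
    """Closed-form KL(N(mu0, s0^2) || N(mu1, s1^2))."""
    return np.log(s1 / s0) + (s0**2 + (mu0 - mu1)**2) / (2 * s1**2) - 0.5

def geometric_js_gauss(mu1, s1, mu2, s2, alpha=0.5):
    """Geometric JS divergence between two univariate Gaussians (sketch).

    The normalized geometric mean p^(1-alpha) q^alpha of two Gaussian
    densities is itself Gaussian, so the divergence is in closed form.
    """
    prec = (1 - alpha) / s1**2 + alpha / s2**2       # precision of the geometric mixture
    s_g = np.sqrt(1.0 / prec)
    mu_g = s_g**2 * ((1 - alpha) * mu1 / s1**2 + alpha * mu2 / s2**2)
    return ((1 - alpha) * kl_gauss(mu1, s1, mu_g, s_g)
            + alpha * kl_gauss(mu2, s2, mu_g, s_g))

def classical_js_gauss_mc(mu1, s1, mu2, s2, n=200_000, seed=0):
    """Monte Carlo estimate of the ordinary JS divergence, which has no
    closed form for Gaussians (the arithmetic mixture is not Gaussian)."""
    rng = np.random.default_rng(seed)
    logpdf = lambda x, mu, s: -0.5 * np.log(2 * np.pi * s**2) - (x - mu)**2 / (2 * s**2)
    log_mix = lambda x: np.logaddexp(logpdf(x, mu1, s1), logpdf(x, mu2, s2)) - np.log(2)
    xp = rng.normal(mu1, s1, n)                      # samples from p for KL(p || m)
    xq = rng.normal(mu2, s2, n)                      # samples from q for KL(q || m)
    kl_pm = np.mean(logpdf(xp, mu1, s1) - log_mix(xp))
    kl_qm = np.mean(logpdf(xq, mu2, s2) - log_mix(xq))
    return 0.5 * (kl_pm + kl_qm)

if __name__ == "__main__":
    print("geometric JS      :", geometric_js_gauss(0.0, 1.0, 2.0, 1.5))
    print("classical JS (MC) :", classical_js_gauss_mc(0.0, 1.0, 2.0, 1.5))
```

The two quantities are generally different divergences; the point of the sketch is only that the geometric variant is computable exactly, whereas the classical one must be estimated numerically for Gaussians.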
