A Short Note on the Jensen-Shannon Divergence between Simple Mixture Distributions

12/05/2018
by Bernhard C. Geiger et al.

This short note presents results about the symmetric Jensen-Shannon divergence between two discrete mixture distributions p_1 and p_2. Specifically, for i=1,2, p_i is the mixture of a common distribution q and a distribution p̃_i with mixture proportion λ_i. In general, p̃_1≠p̃_2 and λ_1≠λ_2. We provide experimental and theoretical insight into the behavior of the symmetric Jensen-Shannon divergence between p_1 and p_2 as the mixture proportions or the divergence between p̃_1 and p̃_2 change. We also provide insight into scenarios where the supports of the distributions p̃_1, p̃_2, and q do not coincide.
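
Below is a minimal numerical sketch, not taken from the paper, of the quantities the abstract describes. It assumes the convention p_i = λ_i·p̃_i + (1 − λ_i)·q for "mixture proportion λ_i" (one possible reading) and natural-logarithm units; the variable names (ptilde1, lam1, ...) and the toy distributions are illustrative choices, not the authors'.

    # Minimal sketch (not from the paper) of the mixture setup and the
    # symmetric Jensen-Shannon divergence, in nats (natural log).
    import numpy as np

    def kl(p, q):
        # Kullback-Leibler divergence D(p || q); terms with p[x] = 0 contribute 0.
        mask = p > 0
        return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

    def jsd(p1, p2):
        # Symmetric Jensen-Shannon divergence with the equal-weight midpoint m.
        m = 0.5 * (p1 + p2)
        return 0.5 * kl(p1, m) + 0.5 * kl(p2, m)

    # Toy example: common component q and two "private" components ptilde1,
    # ptilde2 whose supports do not coincide with q's (illustrative values).
    q       = np.array([0.5, 0.5, 0.0, 0.0])
    ptilde1 = np.array([0.0, 0.0, 1.0, 0.0])
    ptilde2 = np.array([0.0, 0.0, 0.0, 1.0])

    lam1, lam2 = 0.3, 0.6                 # assumed mixture proportions
    p1 = lam1 * ptilde1 + (1 - lam1) * q  # p_1 = lambda_1 * ptilde_1 + (1 - lambda_1) * q
    p2 = lam2 * ptilde2 + (1 - lam2) * q

    print(jsd(p1, p2))  # symmetric JSD between the two mixtures

The toy choice deliberately places the mass of ptilde1 and ptilde2 outside the support of q, which corresponds to the non-coinciding-support scenario mentioned in the abstract; varying lam1, lam2, or the two private components lets one probe the behavior the note studies.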

Related research

12/02/2019 - On a generalization of the Jensen-Shannon divergence
The Jensen-Shannon divergence is a renowned bounded symmetrization of the ...

01/29/2021 - On f-divergences between Cauchy distributions
We prove that the f-divergences between univariate Cauchy distributions ...

02/20/2023 - Metrization of powers of the Jensen-Shannon divergence
Metrization of statistical divergences is useful in both theoretical and ...

05/25/2023 - The Representation Jensen-Shannon Divergence
Statistical divergences quantify the difference between probability dist...

09/21/2010 - A family of statistical symmetric divergences based on Jensen's inequality
We introduce a novel parametric family of symmetric information-theoreti...

02/25/2020 - A short note on learning discrete distributions
The goal of this short note is to provide simple proofs for the "folklor...

10/28/2008 - A non-negative expansion for small Jensen-Shannon Divergences
In this report, we derive a non-negative series expansion for the Jensen...
