A non-negative expansion for small Jensen-Shannon Divergences

10/28/2008
by Anil Raj, et al.

In this report, we derive a non-negative series expansion for the Jensen-Shannon divergence (JSD) between two probability distributions. This expansion is useful for numerical evaluation of the JSD when the two distributions are nearly equal, a regime in which small numerical errors would otherwise dominate the computed value.
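To illustrate the regime the abstract describes, the sketch below compares a direct evaluation of the JSD from its definition with the leading (second-order) term of the standard small-difference expansion, JSD(p, q) ≈ (1/8) Σ_i (p_i − q_i)² / m_i with m = (p + q)/2 (in nats). This is a minimal illustration of the numerical issue, not the paper's own expansion; the function names are ours.

```python
import numpy as np

def jsd_naive(p, q):
    """JSD in nats, computed directly from its definition:
    JSD(p, q) = (1/2) KL(p || m) + (1/2) KL(q || m), m = (p + q)/2.
    For nearly equal p and q, each log term is close to zero and
    floating-point cancellation can dominate the tiny true value."""
    m = 0.5 * (p + q)
    def kl(a, b):
        # Convention: terms with a_i = 0 contribute zero.
        return np.sum(np.where(a > 0, a * np.log(a / b), 0.0))
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def jsd_leading_term(p, q):
    """Leading term of the small-difference expansion of the JSD:
    JSD(p, q) ~ (1/8) * sum((p - q)**2 / m).  Each summand is a
    square, so the result is non-negative by construction."""
    m = 0.5 * (p + q)
    return 0.125 * np.sum((p - q) ** 2 / m)

# Two nearly equal distributions: the true JSD here is about 2e-16,
# at the edge of double precision for the direct formula.
p = np.array([0.25, 0.25, 0.25, 0.25])
q = p + np.array([1e-8, -1e-8, 1e-8, -1e-8])

print(jsd_naive(p, q))
print(jsd_leading_term(p, q))
```

The direct formula subtracts nearly equal logarithms, so its relative error blows up as p → q, while the expansion's leading term is a sum of squares and stays non-negative and accurate in exactly that limit.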


