On a Variational Definition for the Jensen-Shannon Symmetrization of Distances Based on the Information Radius
We generalize the Jensen-Shannon divergence by considering a variational definition with respect to a generic mean, thereby extending the notion of Sibson's information radius. The variational definition applies to any arbitrary distance and yields another way to define a Jensen-Shannon symmetrization of distances. When the variational optimization is further constrained to prescribed probability measure families, we obtain relative Jensen-Shannon divergences and their symmetrizations, which generalize the concept of information projections. Finally, we discuss applications of these variational Jensen-Shannon divergences and diversity indices to clustering and quantization of probability measures, including statistical mixtures.
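As a minimal sketch of the variational definition sketched in the abstract (the notation here is our own and may differ from the paper's exact formulation), Sibson's information radius recovers the ordinary Jensen-Shannon divergence when the distance is the Kullback-Leibler divergence and the optimization is over all probability measures:

\[
\mathrm{JS}(p, q) \;=\; \min_{c} \, \frac{1}{2}\Big( \mathrm{KL}(p : c) + \mathrm{KL}(q : c) \Big),
\]

with the minimum attained at the arithmetic mixture \( c = \frac{p+q}{2} \). Replacing \(\mathrm{KL}\) with an arbitrary distance \(D\) then yields a variational Jensen-Shannon symmetrization of \(D\), and restricting \(c\) to a prescribed family of probability measures yields the relative variants mentioned above.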