The dually flat information geometry of the mixture family of two prescribed Cauchy components
In information geometry, a strictly convex and smooth function induces a dually flat Hessian manifold equipped with a pair of dual Bregman divergences, herein termed a Bregman manifold. Two common types of Bregman manifolds encountered in statistics are (1) the exponential family manifolds induced by the cumulant functions of regular exponential families, and (2) the mixture family manifolds induced by the Shannon negentropies of statistical mixture families with prescribed linearly independent mixture components. However, the differential entropy of a mixture of continuous probability densities sharing the same support is in general not known in closed form, which makes the implementation of mixture family manifolds difficult in practice. In this work, we report an exception: the family of mixtures of two prescribed and distinct Cauchy distributions. We exemplify the explicit construction of the dually flat manifold induced by the differential negentropy in this very particular setting. This construction lets one use the geometric toolbox of Bregman algorithms, and yields closed-form formulas (albeit large ones) for the Kullback-Leibler divergence and the Jensen-Shannon divergence between two mixtures of two prescribed Cauchy components.
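As an illustration of the construction described above (a sketch only, not the paper's closed-form formulas), the following Python snippet builds the mixture family m_theta = (1 - theta) p0 + theta p1 of two arbitrarily chosen Cauchy components, evaluates the negentropy generator F(theta) = -h(m_theta) and its derivative by numerical quadrature, and checks that the induced Bregman divergence B_F(a : b) coincides with the Kullback-Leibler divergence KL(m_a : m_b). The specific component parameters and function names are assumptions made for this example.

```python
# Numerical sketch: Bregman divergence of the mixture-family negentropy
# reproduces the Kullback-Leibler divergence between mixtures.
import numpy as np
from scipy import integrate
from scipy.stats import cauchy

# Two prescribed, distinct Cauchy components (parameters chosen arbitrarily).
p0 = cauchy(loc=0.0, scale=1.0).pdf
p1 = cauchy(loc=2.0, scale=0.5).pdf

def mix(theta):
    """Density of the mixture m_theta = (1 - theta) p0 + theta p1."""
    return lambda x: (1.0 - theta) * p0(x) + theta * p1(x)

def negentropy(theta):
    """F(theta) = -h(m_theta) = integral of m_theta * log(m_theta), by quadrature."""
    m = mix(theta)
    val, _ = integrate.quad(lambda x: m(x) * np.log(m(x)), -np.inf, np.inf, limit=200)
    return val

def negentropy_prime(theta):
    """F'(theta) = integral of (p1 - p0) * log(m_theta) (derivative of the generator)."""
    m = mix(theta)
    val, _ = integrate.quad(lambda x: (p1(x) - p0(x)) * np.log(m(x)), -np.inf, np.inf, limit=200)
    return val

def bregman_kl(a, b):
    """KL(m_a : m_b) expressed as the Bregman divergence B_F(a : b)."""
    return negentropy(a) - negentropy(b) - (a - b) * negentropy_prime(b)

def direct_kl(a, b):
    """KL(m_a : m_b) computed directly by quadrature, for cross-checking."""
    ma, mb = mix(a), mix(b)
    val, _ = integrate.quad(lambda x: ma(x) * np.log(ma(x) / mb(x)), -np.inf, np.inf, limit=200)
    return val

if __name__ == "__main__":
    a, b = 0.3, 0.7
    print("Bregman divergence B_F(a : b):", bregman_kl(a, b))
    print("Direct KL(m_a : m_b)         :", direct_kl(a, b))  # the two values should agree
```

The numerical agreement reflects the general identity KL(m_a : m_b) = B_F(a : b) for mixture families; the contribution of the paper is that, for two prescribed Cauchy components, F and hence these divergences admit closed-form (if lengthy) expressions rather than requiring quadrature.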