Revisiting Chernoff Information with Likelihood Ratio Exponential Families

07/08/2022
by Frank Nielsen, et al.

The Chernoff information between two probability measures is a statistical divergence measuring their deviation, defined as their maximally skewed Bhattacharyya distance. Although the Chernoff information was originally introduced for bounding the Bayes error in statistical hypothesis testing, the divergence has since found many other applications, from information fusion to quantum information, owing to its empirical robustness. From the viewpoint of information theory, the Chernoff information can also be interpreted as a minmax symmetrization of the Kullback–Leibler divergence. In this paper, we first revisit the Chernoff information between two densities of a measurable Lebesgue space by considering the exponential families induced by their geometric mixtures: the so-called likelihood ratio exponential families. Second, we show how to (i) solve exactly the Chernoff information between any two univariate Gaussian distributions, or obtain a closed-form formula using symbolic computing, (ii) report a closed-form formula for the Chernoff information of centered Gaussians with scaled covariance matrices, and (iii) use a fast numerical scheme to approximate the Chernoff information between any two multivariate Gaussian distributions.
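
The Chernoff information admits the variational form C(p, q) = max_{α∈(0,1)} −log ∫ p^α q^(1−α) dμ, i.e., the maximally skewed Bhattacharyya distance. As a quick illustration of this formulation for case (i), here is a minimal Python sketch that plugs the closed-form α-skewed Bhattacharyya distance between two univariate Gaussians into a bounded one-dimensional optimizer. This is an assumed numerical illustration, not the paper's exact or symbolic solution, and the function names are my own.

```python
# Sketch: Chernoff information between two univariate Gaussians obtained by
# numerically maximizing the skewed Bhattacharyya distance over the skew alpha.
# Illustrative code only; not the exact/symbolic method described in the paper.

import numpy as np
from scipy.optimize import minimize_scalar


def skewed_bhattacharyya(alpha, mu1, sigma1, mu2, sigma2):
    """alpha-skewed Bhattacharyya distance D_alpha = -log int p1^alpha p2^(1-alpha) dx
    between N(mu1, sigma1^2) and N(mu2, sigma2^2), in closed form."""
    v1, v2 = sigma1 ** 2, sigma2 ** 2
    v = alpha * v2 + (1.0 - alpha) * v1  # interpolated variance term
    quad = alpha * (1.0 - alpha) * (mu1 - mu2) ** 2 / (2.0 * v)
    log_norm = 0.5 * np.log(v / (v1 ** (1.0 - alpha) * v2 ** alpha))
    return quad + log_norm


def chernoff_information(mu1, sigma1, mu2, sigma2):
    """Chernoff information C = max over alpha in (0,1) of D_alpha(p1, p2)."""
    res = minimize_scalar(
        lambda a: -skewed_bhattacharyya(a, mu1, sigma1, mu2, sigma2),
        bounds=(1e-9, 1.0 - 1e-9),
        method="bounded",
    )
    return -res.fun, res.x  # (Chernoff information, optimal skew alpha*)


if __name__ == "__main__":
    c, alpha_star = chernoff_information(0.0, 1.0, 1.0, 2.0)
    print(f"Chernoff information: {c:.6f} at alpha* = {alpha_star:.4f}")
```

Since ∫ p^α q^(1−α) dμ is log-convex in α (by Hölder's inequality), the objective −log ∫ p^α q^(1−α) dμ is concave in α, so the bounded scalar search recovers the unique optimal skew α*.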

Related research

A closed-form formula for the Kullback-Leibler divergence between Cauchy distributions (05/27/2019)
We report a closed-form expression for the Kullback-Leibler divergence b...

On a generalization of the Jensen-Shannon divergence and the JS-symmetrization of distances relying on abstract means (04/08/2019)
The Jensen-Shannon divergence is a renowned bounded symmetrization of the ...

Chernoff information of exponential families (02/14/2011)
Chernoff information upper bounds the probability of error of the optima...

The duo Fenchel-Young divergence (02/22/2022)
By calculating the Kullback-Leibler divergence between two probability m...

The statistical Minkowski distances: Closed-form formula for Gaussian Mixture Models (01/09/2019)
The traditional Minkowski distances are induced by the corresponding Min...

On Hölder projective divergences (01/14/2017)
We describe a framework to build distances by measuring the tightness of...

Information Processing Equalities and the Information-Risk Bridge (07/25/2022)
We introduce two new classes of measures of information for statistical ...
