Estimating Mixture Entropy with Pairwise Distances

06/08/2017
by Artemy Kolchinsky, et al.

Mixture distributions arise in many parametric and non-parametric settings, for example in Gaussian mixture models and in non-parametric estimation. It is often necessary to compute the entropy of a mixture, but in most cases this quantity has no closed-form expression, making some form of approximation necessary. We propose a family of estimators based on a pairwise distance function between mixture components, and show that this estimator class has many attractive properties. For many distributions of interest, the proposed estimators are efficient to compute, differentiable in the mixture parameters, and become exact when the mixture components are clustered. We prove that this family includes lower and upper bounds on the mixture entropy. The Chernoff α-divergence yields a lower bound when chosen as the distance function, with the Bhattacharyya distance providing the tightest lower bound for components that are symmetric and members of a location family. The Kullback-Leibler divergence yields an upper bound when used as the distance function. We provide closed-form expressions for these bounds in the case of Gaussian mixtures, and discuss their applications to the estimation of mutual information. Using numerical simulations, we then demonstrate that our bounds are significantly tighter than well-known existing bounds. This estimator class is well suited to optimization problems involving maximization or minimization of entropy and mutual information, such as MaxEnt and rate-distortion problems.
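As a rough illustration of the idea, the sketch below applies a pairwise-distance estimator of the form Ĥ_D = Σ_i w_i H(p_i) - Σ_i w_i ln Σ_j w_j exp(-D(p_i‖p_j)) to a Gaussian mixture, where the closed-form KL and Bhattacharyya distances between Gaussians are available. This is a minimal NumPy sketch, not the authors' reference implementation; the function names (`pairwise_entropy_estimate`, `kl_gaussian`, `bhattacharyya_gaussian`) and the example mixture are illustrative assumptions.

```python
# Minimal sketch of a pairwise-distance mixture-entropy estimator for a
# Gaussian mixture. Names and API are illustrative, not from the paper.
import numpy as np


def gaussian_entropy(cov):
    """Differential entropy of N(mu, cov): 0.5 * ln((2*pi*e)^d * det(cov))."""
    d = cov.shape[0]
    _, logdet = np.linalg.slogdet(cov)
    return 0.5 * (d * np.log(2.0 * np.pi * np.e) + logdet)


def kl_gaussian(mu_i, cov_i, mu_j, cov_j):
    """Closed-form KL(N_i || N_j) between two multivariate Gaussians."""
    d = mu_i.shape[0]
    diff = mu_j - mu_i
    cov_j_inv = np.linalg.inv(cov_j)
    _, logdet_i = np.linalg.slogdet(cov_i)
    _, logdet_j = np.linalg.slogdet(cov_j)
    return 0.5 * (np.trace(cov_j_inv @ cov_i)
                  + diff @ cov_j_inv @ diff
                  - d + logdet_j - logdet_i)


def bhattacharyya_gaussian(mu_i, cov_i, mu_j, cov_j):
    """Closed-form Bhattacharyya distance between two multivariate Gaussians."""
    cov_avg = 0.5 * (cov_i + cov_j)
    diff = mu_j - mu_i
    _, logdet_avg = np.linalg.slogdet(cov_avg)
    _, logdet_i = np.linalg.slogdet(cov_i)
    _, logdet_j = np.linalg.slogdet(cov_j)
    return (0.125 * diff @ np.linalg.solve(cov_avg, diff)
            + 0.5 * (logdet_avg - 0.5 * (logdet_i + logdet_j)))


def pairwise_entropy_estimate(weights, mus, covs, distance):
    """H_hat = sum_i w_i H(p_i) - sum_i w_i ln sum_j w_j exp(-D(p_i || p_j))."""
    n = len(weights)
    estimate = sum(w * gaussian_entropy(c) for w, c in zip(weights, covs))
    for i in range(n):
        log_terms = np.array([
            np.log(weights[j]) - distance(mus[i], covs[i], mus[j], covs[j])
            for j in range(n)
        ])
        # log-sum-exp for numerical stability
        m = log_terms.max()
        estimate -= weights[i] * (m + np.log(np.exp(log_terms - m).sum()))
    return estimate


# Example: a two-component 2-D Gaussian mixture. With D = KL the estimate is
# an upper bound on the mixture entropy; with D = Bhattacharyya, a lower bound.
weights = np.array([0.5, 0.5])
mus = [np.array([0.0, 0.0]), np.array([3.0, 0.0])]
covs = [np.eye(2), np.eye(2)]
upper = pairwise_entropy_estimate(weights, mus, covs, kl_gaussian)
lower = pairwise_entropy_estimate(weights, mus, covs, bhattacharyya_gaussian)
print(f"lower bound = {lower:.4f}, upper bound = {upper:.4f}")
```

Because both bounds are smooth, closed-form functions of the means, covariances, and weights, this kind of estimator can be plugged directly into gradient-based entropy or mutual-information optimization, as the abstract notes.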
