Nonparanormal Information Estimation

02/24/2017
by Shashank Singh, et al.

We study the problem of using i.i.d. samples from an unknown multivariate probability distribution p to estimate the mutual information of p. This problem has recently received attention in two settings: (1) where p is assumed to be Gaussian and (2) where p is assumed only to lie in a large nonparametric smoothness class. Estimators proposed for the Gaussian case converge in high dimensions when the Gaussian assumption holds, but are brittle, failing dramatically when p is not Gaussian. Estimators proposed for the nonparametric case fail to converge with realistic sample sizes except in very low dimensions. As a result, there is a lack of robust mutual information estimators for many kinds of realistic data. To address this, we propose estimators for mutual information when p is assumed to be a nonparanormal (a.k.a., Gaussian copula) model, a semiparametric compromise between the Gaussian and nonparametric extremes. Using theoretical bounds and experiments, we show these estimators strike a practical balance between robustness and scaling with dimensionality.
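The nonparanormal approach rests on two facts the abstract alludes to: rank statistics are invariant to the unknown monotone marginal transforms, and the mutual information of a Gaussian copula depends only on its latent correlation matrix R, via I = -(1/2) log det(R). The sketch below illustrates this idea (it is not the authors' exact estimator, which includes further corrections): estimate Spearman's rho from ranks, map it to the latent Gaussian correlation with R = 2 sin(pi * rho / 6), and plug into the Gaussian formula.

```python
import numpy as np
from scipy.stats import rankdata

def nonparanormal_mi(X):
    """Rank-based sketch of mutual information under a Gaussian copula.

    Ranks each coordinate (invariant to monotone marginals), converts
    the Spearman correlation of the ranks to the latent Gaussian
    correlation via R = 2*sin(pi*rho/6), and returns -0.5 * log det(R).
    """
    ranks = np.apply_along_axis(rankdata, 0, np.asarray(X, dtype=float))
    rho = np.corrcoef(ranks, rowvar=False)   # Spearman's rho matrix
    R = 2.0 * np.sin(np.pi * rho / 6.0)      # latent correlation estimate
    np.fill_diagonal(R, 1.0)
    _, logdet = np.linalg.slogdet(R)         # R is positive definite here
    return -0.5 * logdet

# Example: bivariate Gaussian copula with latent correlation 0.8, pushed
# through a monotone non-Gaussian transform (lognormal marginals).
# True MI = -0.5 * log(1 - 0.8**2) ~ 0.51 nats.
rng = np.random.default_rng(0)
cov = np.array([[1.0, 0.8], [0.8, 1.0]])
Z = rng.multivariate_normal(np.zeros(2), cov, size=5000)
X = np.exp(Z)
print(nonparanormal_mi(X))
```

Because the estimator only ever sees ranks, applying exp (or any strictly increasing function) to each coordinate leaves the estimate unchanged; this is exactly the robustness to non-Gaussian marginals that the abstract contrasts with fully parametric Gaussian estimators.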


Related research

- 06/19/2023, Beyond Normal: On the Evaluation of Mutual Information Estimators
  Mutual information is a general statistical dependency measure which has...
- 03/09/2010, Estimation of Rényi Entropy and Mutual Information Based on Generalized Nearest-Neighbor Graphs
  We present simple and computationally efficient nonparametric estimators...
- 05/09/2022, The Compound Information Bottleneck Outlook
  We formulate and analyze the compound information bottleneck programming...
- 11/02/2017, Geometric k-nearest neighbor estimation of entropy and mutual information
  Like most nonparametric estimators of information functionals involving ...
- 12/12/2021, Optimal Partitions for Nonparametric Multivariate Entropy Estimation
  Efficient and accurate estimation of multivariate empirical probability ...
- 06/12/2020, On Neural Estimators for Conditional Mutual Information Using Nearest Neighbors Sampling
  The estimation of mutual information (MI) or conditional mutual informat...
- 06/01/2023, On the Effectiveness of Hybrid Mutual Information Estimation
  Estimating the mutual information from samples from a joint distribution...
