Information Theoretic Structure Learning with Confidence

09/13/2016
by Kevin R. Moon et al.

Information theoretic measures (e.g. the Kullback-Leibler divergence and Shannon mutual information) have been used for exploring possibly nonlinear multivariate dependencies in high dimensions. If these dependencies are assumed to follow a Markov factor graph model, this exploration process is called structure discovery. For discrete-valued samples, estimates of the information divergence over the parametric class of multinomial models lead to structure discovery methods whose mean squared error achieves parametric convergence rates as the sample size grows. However, a naive application of this method to continuous nonparametric multivariate models converges much more slowly. In this paper we introduce a new method for nonparametric structure discovery that uses weighted ensemble divergence estimators that achieve parametric convergence rates and obey an asymptotic central limit theorem, which facilitates hypothesis testing and other types of statistical validation.
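To make the discrete-sample baseline concrete, the following is a minimal sketch of the kind of plug-in (multinomial MLE) divergence estimate the abstract refers to: empirical symbol frequencies stand in for the two distributions, and the Kullback-Leibler divergence is computed from those frequencies. The function name and the additive-smoothing parameter `alpha` are illustrative choices, not from the paper, and this is not the authors' ensemble method, only the naive parametric-rate estimator it improves upon.

```python
import math
from collections import Counter

def plugin_kl_divergence(xs, ys, alpha=0.5):
    """Plug-in estimate of KL(P || Q) from two discrete samples.

    Empirical (multinomial MLE) frequencies approximate P and Q.
    A small additive smoothing term `alpha` (a hypothetical choice)
    keeps every symbol's estimated probability strictly positive,
    so the log ratio is always defined.
    """
    support = set(xs) | set(ys)   # union of observed symbols
    k = len(support)
    cx, cy = Counter(xs), Counter(ys)
    nx, ny = len(xs), len(ys)
    kl = 0.0
    for s in support:
        p = (cx[s] + alpha) / (nx + alpha * k)  # smoothed P estimate
        q = (cy[s] + alpha) / (ny + alpha * k)  # smoothed Q estimate
        kl += p * math.log(p / q)
    return kl

# Identical samples give a divergence of zero; shifted samples give a
# strictly positive divergence (Gibbs' inequality).
same = plugin_kl_divergence([0, 1] * 500, [0, 1] * 500)
diff = plugin_kl_divergence([0] * 100 + [1] * 10, [0] * 10 + [1] * 100)
```

Because both smoothed frequency vectors are valid distributions over the same support, the estimate is nonnegative and vanishes exactly when the empirical distributions coincide. For continuous data, a naive histogram analogue of this estimator is what converges slowly, motivating the weighted ensemble estimators the paper develops.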


