Estimating mutual information and multi-information in large networks

02/03/2005
by Noam Slonim, et al.

We address the practical problem of estimating the information-theoretic relations that characterize large networks. Building on methods developed for the analysis of the neural code, we show that reliable estimates of mutual information can be obtained with manageable computational effort. The same methods also allow estimation of higher order, multi-information terms. We illustrate these ideas with analyses of gene expression, financial markets, and consumer preferences. In each case, the information-theoretic measures correlate with independent, intuitive measures of the underlying structure in the system.
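The paper's full estimation machinery (bias corrections for limited samples) is not reproduced in this abstract, but the basic quantity being estimated can be sketched. Below is a minimal plug-in (maximum-likelihood) estimator of the mutual information between two discrete variables from paired samples; the function name and implementation are illustrative, not the authors' code:

```python
import numpy as np

def mutual_information(x, y):
    """Plug-in estimate of I(X;Y) in bits from paired discrete samples.

    Builds the empirical joint distribution and applies
    I(X;Y) = sum_{x,y} p(x,y) * log2( p(x,y) / (p(x) p(y)) ).
    Note: the naive plug-in estimator is biased upward for small
    samples, which is exactly the regime the paper's methods address.
    """
    x = np.asarray(x)
    y = np.asarray(y)
    # Map symbols to integer indices and count joint occurrences
    xs, xi = np.unique(x, return_inverse=True)
    ys, yi = np.unique(y, return_inverse=True)
    joint = np.zeros((len(xs), len(ys)))
    np.add.at(joint, (xi, yi), 1)
    p_xy = joint / joint.sum()
    p_x = p_xy.sum(axis=1, keepdims=True)   # marginal of X
    p_y = p_xy.sum(axis=0, keepdims=True)   # marginal of Y
    nz = p_xy > 0                           # 0 * log 0 = 0 by convention
    return float(np.sum(p_xy[nz] * np.log2(p_xy[nz] / (p_x @ p_y)[nz])))

# Two perfectly correlated fair binary variables share exactly 1 bit
x = [0, 0, 1, 1] * 100
y = [0, 0, 1, 1] * 100
print(mutual_information(x, y))  # → 1.0
```

The multi-information discussed in the abstract generalizes this to n variables as the sum of the marginal entropies minus the joint entropy, and reduces to mutual information for n = 2.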

Related research

06/13/2019: Factorized Mutual Information Maximization
We investigate the sets of joint probability distributions that maximize...

03/21/2019: Decomposing information into copying versus transformation
In many real-world systems, information can be transmitted in two qualit...

04/22/2019: Learning gradient-based ICA by neurally estimating mutual information
Several methods of estimating the mutual information of random variables...

12/19/2014: Information-Theoretic Methods for Identifying Relationships among Climate Variables
Information-theoretic quantities, such as entropy, are used to quantify ...

09/14/2023: Generalized Decomposition of Multivariate Information
Since its introduction, the partial information decomposition (PID) has ...

05/06/2020: Regularized Estimation of Information via High Dimensional Canonical Correlation Analysis
In recent years, there has been an upswing of interest in estimating inf...

10/09/2019: On the Possibility of Rewarding Structure Learning Agents: Mutual Information on Linguistic Random Sets
We present a first attempt to elucidate an Information-Theoretic approac...
