Exact mutual information for lognormal random variables

06/23/2023
by Maurycy Chwiłka, et al.

Correlated stochastic observables with lognormal distributions are ubiquitous in nature, and hence deserve an exact information-theoretic characterization. We derive a general analytical formula for the mutual information between vectors of lognormally distributed random variables, and provide lower and upper bounds on its value. The formula and its bounds involve determinants and traces of high-dimensional covariance matrices of these variables. Exact explicit forms of the mutual information are calculated for some special cases and types of correlations. As an example, we provide an analytic formula for the mutual information between neurons, relevant to neural networks in the brain.
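The full derivation is in the paper; as a reading aid, the sketch below (not the paper's method) relies on two standard facts: mutual information is invariant under the componentwise bijection exp(·), so for jointly lognormal vectors X = exp(U), Y = exp(V) one has I(X; Y) = I(U; V); and for jointly Gaussian (U, V), I(U; V) = ½ ln( det(C_UU) det(C_VV) / det(C) ), where C is the covariance matrix of the concatenated log-space vector. The function name and the toy covariance structure are illustrative assumptions, not taken from the paper, which additionally expresses its formula and bounds through the covariances of the lognormal variables themselves.

```python
import numpy as np

def lognormal_mutual_information(cov_log: np.ndarray, dim_x: int) -> float:
    """Exact MI (in nats) between X = exp(U) and Y = exp(V), where the
    concatenated log-space Gaussian vector (U, V) has covariance `cov_log`
    and U has dimension `dim_x`. (Hypothetical helper, not from the paper.)"""
    c_uu = cov_log[:dim_x, :dim_x]   # covariance of log X
    c_vv = cov_log[dim_x:, dim_x:]   # covariance of log Y
    # slogdet is numerically stabler than det for high-dimensional matrices.
    sign_u, logdet_u = np.linalg.slogdet(c_uu)
    sign_v, logdet_v = np.linalg.slogdet(c_vv)
    sign_f, logdet_f = np.linalg.slogdet(cov_log)
    if min(sign_u, sign_v, sign_f) <= 0:
        raise ValueError("covariance matrix is not positive definite")
    # I(X; Y) = I(U; V) = 1/2 * ln( det(C_UU) * det(C_VV) / det(C) )
    return 0.5 * (logdet_u + logdet_v - logdet_f)

if __name__ == "__main__":
    # Toy example: 2-dimensional X and Y, with each pair (u_i, v_i)
    # correlated at rho and independent of the other pair.
    rho = 0.6
    block = rho * np.eye(2)
    cov = np.block([[np.eye(2), block], [block.T, np.eye(2)]])
    mi = lognormal_mutual_information(cov, dim_x=2)
    # Two independent pairs, each contributing -1/2 * ln(1 - rho^2),
    # so the total is -ln(1 - rho^2) ~= 0.446 nats.
    print(mi)
```

Note that working in log-space sidesteps the heavy-tailed lognormal densities entirely; the determinants and traces mentioned in the abstract enter when the result is re-expressed through the covariance matrices of the lognormal variables themselves.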
