Exact mutual information for lognormal random variables

06/23/2023
by Maurycy Chwiłka, et al.

Correlated stochastic observables with lognormal distributions are ubiquitous in nature, and hence they deserve an exact information-theoretic characterization. We derive a general analytical formula for the mutual information between vectors of lognormally distributed random variables, and provide lower and upper bounds on its value. The formula and its bounds involve determinants and traces of high-dimensional covariance matrices of these variables. Exact explicit forms of the mutual information are calculated for some special cases and types of correlations. As an example, we provide an analytic formula for the mutual information between neurons, relevant for neural networks in the brain.
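The paper's exact formulas are not reproduced here, but the structure described in the abstract can be illustrated with a standard fact: mutual information is invariant under componentwise monotone transforms, so for jointly lognormal vectors (exponentials of a jointly Gaussian vector) the MI equals the Gaussian one, I(X;Y) = ½ ln(det Σ_XX · det Σ_YY / det Σ), which indeed involves determinants of the covariance matrices. The sketch below is an assumption-laden illustration of this classical formula, not the authors' derivation; the function name `gaussian_mi` and the block partition are ours.

```python
import numpy as np

def gaussian_mi(cov, dx):
    """MI (in nats) between the first dx and the remaining components of a
    jointly Gaussian vector with covariance matrix `cov`.

    Because MI is invariant under componentwise monotone maps such as exp,
    this also equals the MI between the corresponding jointly lognormal
    variables. Uses the classical formula
        I(X;Y) = 0.5 * ln( det(S_xx) * det(S_yy) / det(S) ).
    """
    cov = np.asarray(cov, dtype=float)
    sxx = cov[:dx, :dx]          # covariance of the first block
    syy = cov[dx:, dx:]          # covariance of the second block
    # slogdet is numerically safer than log(det(...)) in high dimension
    ld_xx = np.linalg.slogdet(sxx)[1]
    ld_yy = np.linalg.slogdet(syy)[1]
    ld_full = np.linalg.slogdet(cov)[1]
    return 0.5 * (ld_xx + ld_yy - ld_full)

# Scalar case with correlation rho: the formula reduces to -0.5*ln(1 - rho^2)
rho = 0.5
cov = np.array([[1.0, rho],
                [rho, 1.0]])
print(gaussian_mi(cov, 1))  # ≈ 0.1438 nats
```

For independent blocks the determinant factorizes and the MI is zero, as expected; for the scalar case the value matches the textbook expression −½ ln(1 − ρ²).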
