Learning from Complex Systems: On the Roles of Entropy and Fisher Information in Pairwise Isotropic Gaussian Markov Random Fields

08/25/2011
by   Alexandre L. M. Levada, et al.

Markov Random Field models are powerful tools for the study of complex systems. However, little is known about how the interactions between the elements of such systems are encoded, especially from an information-theoretic perspective. In this paper, our goal is to elucidate the connection between Fisher information, Shannon entropy, information geometry, and the behavior of complex systems modeled by isotropic pairwise Gaussian Markov random fields. We propose analytical expressions to compute local and global versions of these measures using Besag's pseudo-likelihood function, characterizing the system's behavior through its Fisher curve, a parametric trajectory across the information space that provides a geometric representation for the study of complex systems. Computational experiments show how the proposed tools can be useful in extracting relevant information from complex patterns. The results quantify and support our main conclusion: in terms of information, moving towards higher-entropy states (A --> B) is different from moving towards lower-entropy states (B --> A), since the Fisher curves are not the same given a natural orientation (the direction of time).
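To make the abstract's measures concrete, here is a minimal sketch of how local entropy and Fisher information could be estimated from a lattice sample via Besag's pseudo-likelihood. The 4-neighbour system, the parameterization of the conditional mean as mu + beta * sum of centred neighbours, and the closed forms below are illustrative assumptions for a standard isotropic pairwise Gaussian MRF, not the paper's exact expressions.

```python
import numpy as np

def pseudo_likelihood_measures(X, mu, sigma2, beta):
    """Estimate information-theoretic measures for an isotropic pairwise
    Gaussian MRF on a 2-D lattice from a single observed field X.

    Assumes the conditional of each interior site given its 4 neighbours is
    Gaussian with mean mu + beta * sum(neighbours - mu) and variance sigma2.
    """
    c = X - mu
    # sum of centred neighbours for each interior site (4-neighbour system)
    s = c[:-2, 1:-1] + c[2:, 1:-1] + c[1:-1, :-2] + c[1:-1, 2:]
    x = X[1:-1, 1:-1]
    m = mu + beta * s                      # conditional means

    # Shannon entropy of each Gaussian conditional: 0.5*log(2*pi*e*sigma2)
    entropy = 0.5 * np.log(2 * np.pi * np.e * sigma2)

    # negative average pseudo-log-likelihood (a global entropy proxy)
    npll = np.mean(0.5 * np.log(2 * np.pi * sigma2)
                   + (x - m) ** 2 / (2 * sigma2))

    # observed Fisher information about beta: mean of s_i^2 / sigma2,
    # since the score of each conditional w.r.t. beta is (x_i - m_i)*s_i/sigma2
    fisher_beta = np.mean(s ** 2) / sigma2

    return entropy, npll, fisher_beta
```

Sweeping beta over an interval and plotting (fisher_beta, npll) would trace an empirical analogue of the Fisher curve described in the abstract.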


Related research:

research 11/06/2021
Geodesic curves in Gaussian random field manifolds
Random fields are mathematical structures used to model the spatial inte...

research 01/31/2022
The Curvature Effect in Gaussian Random Fields
Random field models are mathematical structures used in the study of sto...

research 03/24/2022
On the Kullback-Leibler divergence between pairwise isotropic Gaussian-Markov random fields
The Kullback-Leibler divergence or relative entropy is an information-th...

research 05/28/2020
On Functions of Markov Random Fields
We derive two sufficient conditions for a function of a Markov random fi...

research 07/10/2018
Understanding VAEs in Fisher-Shannon Plane
In information theory, Fisher information and Shannon information (entro...

research 03/13/2021
Renyi Entropy of Multivariate Autoregressive Moving Average Control Systems
The Renyi entropy is an important measure of the information, it is prop...

research 06/12/2022
A Functional Information Perspective on Model Interpretation
Contemporary predictive models are hard to interpret as their deep nets ...
