Polyadic Entropy, Synergy and Redundancy among Statistically Independent Processes in Nonlinear Statistical Physics with Microphysical Codependence

12/05/2017
by   Rui A. P. Perdigão, et al.

The information shared among observables representing processes of interest is traditionally evaluated with macroscale measures that characterize aggregate properties of the underlying processes and their interactions. Traditional information measures are grounded in the assumption that each observable represents a memoryless process with no interaction among its microstates. Generalized entropy measures have been formulated in non-extensive statistical mechanics to take such microphysical codependence into account in entropy quantification. Taking these measures into consideration when formulating information measures raises the question of whether, and if so how much, information permeates across scales to affect macroscale information measures. The present study investigates and quantifies the emergence of macroscale information from codependence within the microphysics. In order to isolate the information emerging solely from nonlinearly interacting microphysics, redundancy and synergy are evaluated among macroscale variables that are statistically independent of each other but not necessarily so within their own microphysics. Synergistic and redundant information are found when microphysical interactions take place, even when the statistical distributions factorize. These findings underscore the added value of nonlinear statistical physics to information theory in coevolutionary systems.
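The core mechanism can be illustrated numerically with the well-known pseudo-additivity of the Tsallis entropy, S_q(A,B) = S_q(A) + S_q(B) + (1 - q) S_q(A) S_q(B) for statistically independent A and B: a naive macroscale "shared information" I_q = S_q(A) + S_q(B) - S_q(A,B) is zero only in the Shannon limit q = 1 and becomes nonzero for q ≠ 1 even when the joint distribution factorizes. This is a minimal sketch with toy distributions chosen for illustration, not an implementation of the paper's measures:

```python
import numpy as np

def tsallis_entropy(p, q):
    """Tsallis entropy S_q = (1 - sum_i p_i^q) / (q - 1).

    Recovers the Shannon entropy (in nats) in the limit q -> 1.
    """
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    if np.isclose(q, 1.0):
        return -np.sum(p * np.log(p))
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

# Two statistically independent binary marginals: the joint factorizes.
pa = np.array([0.6, 0.4])
pb = np.array([0.7, 0.3])
pab = np.outer(pa, pb).ravel()  # product (independent) joint distribution

for q in (1.0, 1.5):
    sa = tsallis_entropy(pa, q)
    sb = tsallis_entropy(pb, q)
    sab = tsallis_entropy(pab, q)
    # Macroscale "shared information" read off from the entropies:
    iq = sa + sb - sab
    print(f"q = {q}: I_q = {iq:.6f}")
```

For q = 1 the entropies are additive under independence and I_q vanishes; for q = 1.5 pseudo-additivity yields I_q = (q - 1) S_q(A) S_q(B) > 0, i.e. an apparent macroscale overlap arising purely from the non-extensive (microphysically codependent) entropy form, despite the factorable distributions.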


Related research:

08/13/2017 · Quantifying multivariate redundancy with maximum entropy decompositions of mutual information
Williams and Beer (2010) proposed a nonnegative mutual information decom...

05/13/2018 · Kolmogorov-Sinai entropy and dissipation in driven classical Hamiltonian systems
A central concept in the connection between physics and information theo...

01/05/2019 · A Scale-invariant Generalization of Renyi Entropy and Related Optimizations under Tsallis' Nonextensive Framework
Entropy and cross-entropy are two very fundamental concepts in informati...

02/28/2019 · Quantifying High-order Interdependencies via Multivariate Extensions of the Mutual Information
This article introduces a model-agnostic approach to study statistical s...

08/06/2021 · Independence Properties of Generalized Submodular Information Measures
Recently a class of generalized information measures was defined on sets...

11/12/2021 · Generalized active information: extensions to unbounded domains
In the last three decades, several measures of complexity have been prop...

09/03/2021 · The typical set and entropy in stochastic systems with arbitrary phase space growth
The existence of the typical set is key for the consistence of the ensem...
