Information decomposition to identify relevant variation in complex systems with machine learning

07/10/2023
by Kieran A. Murphy, et al.

One of the fundamental steps toward understanding a complex system is identifying the variation at the scale of the system's components that is most relevant to behavior on a macroscopic scale. Mutual information is a natural means of linking variation across the scales of a system because it is independent of the particular functional relationship between variables. However, estimating mutual information from high-dimensional, continuous-valued data is notoriously difficult, and the desideratum, to reveal important variation in a comprehensible manner, is otherwise only achievable through exhaustive search.

Here we propose a practical, efficient, and broadly applicable methodology that decomposes the information contained in a set of measurements by lossily compressing each measurement with machine learning. Guided by the distributed information bottleneck as a learning objective, the decomposition sorts the variation in the measurements of the system state by its relevance to a specified macroscale behavior, revealing the most important subsets of measurements for different amounts of predictive information. Additional granularity comes from inspecting the learned compression schemes: the variation transmitted during compression consists of the distinctions among measurement values that are most relevant to the macroscale behavior.

We focus our analysis on two paradigmatic complex systems: a Boolean circuit and an amorphous material undergoing plastic deformation. In both examples, specific bits of entropy are identified, out of the high entropy of the full system state, as the most related to macroscale behavior, offering insight into the connection between microscale and macroscale in each complex system. The identification of meaningful variation in data, with the full generality brought by information theory, is thereby made practical for the study of complex systems.
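The paper's method learns lossy compressions with machine learning to scale this idea to high-dimensional, continuous data. For a discrete toy system, though, the underlying quantity, the mutual information between each component and the macroscale behavior, can be computed exactly by enumeration. The sketch below (an illustrative example, not the paper's circuit or code) ranks the inputs of a small Boolean circuit by their mutual information with the output:

```python
import itertools
import math
from collections import Counter

def mutual_information(pairs):
    """Exact mutual information (in bits) from a list of (x, y) samples,
    treating each sample as equally likely."""
    n = len(pairs)
    pxy = Counter(pairs)
    px = Counter(x for x, _ in pairs)
    py = Counter(y for _, y in pairs)
    mi = 0.0
    for (x, y), count in pxy.items():
        p = count / n
        mi += p * math.log2(p / ((px[x] / n) * (py[y] / n)))
    return mi

# Hypothetical toy circuit: macroscale output y = (x1 AND x2) OR x3.
def circuit(x1, x2, x3):
    return int((x1 and x2) or x3)

# Enumerate all microscale states and the resulting macroscale behavior.
states = list(itertools.product([0, 1], repeat=3))
outputs = [circuit(*s) for s in states]

# Rank each input (component-scale measurement) by its relevance,
# i.e. its mutual information with the output.
relevance = sorted(
    ((f"x{i + 1}",
      mutual_information(list(zip((s[i] for s in states), outputs))))
     for i in range(3)),
    key=lambda item: -item[1],
)

for name, mi in relevance:
    print(f"I({name}; y) = {mi:.3f} bits")
```

Here the OR-gated input x3 carries the most information about y (about 0.549 bits out of H(y) ≈ 0.954 bits), while x1 and x2 each carry only about 0.049 bits. This exhaustive ranking is exactly what becomes intractable for continuous, high-dimensional measurements, which is where the learned compression schemes of the distributed information bottleneck come in.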

