Fluctuation-response theorem for Kullback-Leibler divergences to quantify causation

02/13/2021
by Andrea Auconi, et al.

We define a new measure of causation from a fluctuation-response theorem for Kullback-Leibler divergences, based on the information-theoretic cost of perturbations. This information response has both the invariance properties required for an information-theoretic measure and the physical interpretation of a propagation of perturbations. In linear systems, the information response reduces to the transfer entropy, providing a connection between Fisher and mutual information.
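
To make the linear-case claim concrete: for jointly Gaussian processes, the transfer entropy has a closed form in terms of conditional variances, TE_{X→Y} = ½ log( Var(Y_t | Y_{t−1}) / Var(Y_t | Y_{t−1}, X_{t−1}) ). The sketch below is illustrative only (it is not the paper's code, and the coupling coefficients are made up): it simulates a coupled linear VAR(1) system in which X drives Y and estimates this Gaussian transfer entropy from regression residuals.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a coupled linear VAR(1) system in which X drives Y.
# Coefficients are arbitrary illustrative choices, not from the paper.
n = 100_000
x = np.zeros(n)
y = np.zeros(n)
for t in range(1, n):
    x[t] = 0.8 * x[t - 1] + rng.normal()
    y[t] = 0.5 * y[t - 1] + 0.4 * x[t - 1] + rng.normal()

def residual_variance(target, predictors):
    """Variance of target after linear least-squares regression on predictors."""
    beta, *_ = np.linalg.lstsq(predictors, target, rcond=None)
    return np.var(target - predictors @ beta)

yt, yp, xp = y[1:], y[:-1], x[:-1]
v_restricted = residual_variance(yt, np.column_stack([yp]))        # Var(Y_t | Y_{t-1})
v_full = residual_variance(yt, np.column_stack([yp, xp]))          # Var(Y_t | Y_{t-1}, X_{t-1})

# Gaussian transfer entropy X -> Y, in nats.
te_x_to_y = 0.5 * np.log(v_restricted / v_full)
print(f"estimated TE X->Y: {te_x_to_y:.3f} nats")
```

For this parameter choice the stationary covariances give an analytic value of about 0.14 nats, so a clearly positive estimate; repeating the regression with the roles of X and Y swapped yields a transfer entropy near zero, consistent with the one-way coupling.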

Related research

- 02/12/2022: An Information-Theoretic Proof of the Kac–Bernstein Theorem
  A short, information-theoretic proof of the Kac–Bernstein theorem, which...
- 10/27/2020: A Probabilistic Representation of Deep Learning for Improving The Information Theoretic Interpretability
  In this paper, we propose a probabilistic representation of MultiLayer P...
- 12/06/2017: Quantifying how much sensory information in a neural code is relevant for behavior
  Determining how much of the sensory information carried by a neural code...
- 07/10/2011: Information-Theoretic Measures for Objective Evaluation of Classifications
  This work presents a systematic study of objective evaluations of abstai...
- 05/29/2019: Where is the Information in a Deep Neural Network?
  Whatever information a Deep Neural Network has gleaned from past data is...
- 07/07/2021: Information-theoretic characterization of the complete genotype-phenotype map of a complex pre-biotic world
  How information is encoded in bio-molecular sequences is difficult to qu...
- 07/24/2023: On the information-theoretic formulation of network participation
  The participation coefficient is a widely used metric of the diversity o...