Information, learning and falsification

10/17/2011
by David Balduzzi, et al.

There are (at least) three approaches to quantifying information. The first, algorithmic information or Kolmogorov complexity, takes events as strings and, given a universal Turing machine, quantifies the information content of a string as the length of the shortest program producing it. The second, Shannon information, takes events as belonging to ensembles and quantifies the information resulting from observing the given event in terms of the number of alternate events that have been ruled out. The third, statistical learning theory, has introduced measures of capacity that control (in part) the expected risk of classifiers. These capacities quantify the expectations regarding future data that learning algorithms embed into classifiers. This note describes a new method of quantifying information, effective information, that links algorithmic information to Shannon information, and also links both to capacities arising in statistical learning theory. After introducing the measure, we show that it provides a non-universal analog of Kolmogorov complexity. We then apply it to derive basic capacities in statistical learning theory: empirical VC-entropy and empirical Rademacher complexity. A nice byproduct of our approach is an interpretation of the explanatory power of a learning algorithm in terms of the number of hypotheses it falsifies, counted in two different ways for the two capacities. We also discuss how effective information relates to information gain, Shannon and mutual information.
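The paper's own effective information measure is defined in the full text and is not reproduced here. For orientation, the three standard quantities the abstract contrasts have conventional definitions, sketched below in standard notation (hats denote empirical, sample-dependent versions):

```latex
% Kolmogorov complexity of a string x relative to a universal Turing machine U:
K_U(x) = \min\{\, |p| : U(p) = x \,\}

% Shannon information (surprisal) of an event x with probability p(x):
I(x) = -\log_2 p(x)

% Empirical Rademacher complexity of a hypothesis class H on a sample
% S = (x_1, \dots, x_n), with \sigma_i independent uniform \pm 1 variables:
\hat{\mathfrak{R}}_S(H) = \mathbb{E}_\sigma\!\left[\, \sup_{h \in H} \frac{1}{n} \sum_{i=1}^n \sigma_i\, h(x_i) \right]

% Empirical VC-entropy: the log of the number of distinct dichotomies
% (labelings of the sample) that H realizes:
\hat{H}_S(H) = \log \left|\{\, (h(x_1), \dots, h(x_n)) : h \in H \,\}\right|
```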
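As a concrete illustration of the two capacities the abstract derives (not code from the paper), the following minimal Python sketch estimates the empirical Rademacher complexity of a finite hypothesis class by Monte Carlo and computes its empirical VC-entropy on a sample. The function names and the class of threshold classifiers are hypothetical choices made for this example:

```python
import math
import random

def empirical_rademacher(hypotheses, sample, n_trials=2000, rng=None):
    """Monte Carlo estimate of the empirical Rademacher complexity:
    E_sigma[ sup_h (1/n) * sum_i sigma_i * h(x_i) ] for a finite class."""
    rng = rng or random.Random(0)
    n = len(sample)
    total = 0.0
    for _ in range(n_trials):
        sigma = [rng.choice((-1, 1)) for _ in range(n)]  # random +/-1 labels
        total += max(
            sum(s * h(x) for s, x in zip(sigma, sample)) / n
            for h in hypotheses
        )
    return total / n_trials

def empirical_vc_entropy(hypotheses, sample):
    """Log of the number of distinct dichotomies (labelings of the sample)
    that the hypothesis class realizes."""
    dichotomies = {tuple(h(x) for x in sample) for h in hypotheses}
    return math.log(len(dichotomies))

# Hypothetical example: threshold classifiers on [0, 1],
# h_t(x) = +1 if x >= t else -1, for a grid of thresholds t.
hypotheses = [lambda x, t=t: 1 if x >= t else -1
              for t in [i / 10 for i in range(11)]]
rng = random.Random(1)
sample = [rng.uniform(0, 1) for _ in range(20)]

print("empirical Rademacher complexity ~", empirical_rademacher(hypotheses, sample))
print("empirical VC-entropy =", empirical_vc_entropy(hypotheses, sample))
```

On the abstract's reading, the fewer labelings of the sample a class can realize, the more candidate hypotheses the learner has effectively falsified, which is why these counting quantities double as measures of explanatory power.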


