Kolmogorov's Algorithmic Mutual Information Is Equivalent to Bayes' Law

06/29/2019
by Fouad B. Chedid, et al.

Given two events A and B, Bayes' law is based on the argument that the probability of A given B is proportional to the probability of B given A. When probabilities are interpreted in the Bayesian sense, Bayes' law constitutes a learning algorithm that shows how one can learn from a new observation to improve one's belief in a theory consistent with that observation. Kolmogorov's notion of algorithmic information, which is based on the theory of algorithms, proposes an objective measure of the amount of information in a finite string about itself and concludes that for any two finite strings x and y, the amount of information in x about y is almost equal to the amount of information in y about x. We view this conclusion of Kolmogorov's as the algorithmic-information version of Bayes' law. This can be demonstrated easily by considering Levin's work on prefix Kolmogorov complexity and then expressing the Kolmogorov mutual information between two finite strings in terms of Solomonoff's a priori probability.
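To make the claimed correspondence concrete, the following is a minimal LaTeX sketch of the identities involved, assuming standard notation from algorithmic information theory: prefix complexity K, algorithmic mutual information I, Solomonoff's a priori probability m, and x* for a shortest prefix-free program for x. The exact error terms are our reading of the standard results of Levin and Gács, not formulas quoted from the paper.

\documentclass{article}
\usepackage{amsmath}
\begin{document}

% Bayes' law: the posterior P(A|B) is proportional to the likelihood P(B|A),
% which can be rewritten as a symmetric product identity.
\[ P(A \mid B) = \frac{P(B \mid A)\,P(A)}{P(B)},
   \qquad\text{equivalently}\qquad
   P(A)\,P(B \mid A) = P(B)\,P(A \mid B). \]

% Algorithmic mutual information, written with prefix complexity K;
% x^* denotes a shortest prefix-free program for x.
\[ I(x : y) = K(y) - K(y \mid x^*). \]

% Symmetry of information for prefix complexity (Levin, Gács):
\[ K(x, y) = K(x) + K(y \mid x^*) + O(1),
   \qquad\text{hence}\qquad
   I(x : y) = I(y : x) + O(1). \]

% Coding theorem: K(x) = -log m(x) + O(1), where m is Solomonoff's
% a priori probability. Exponentiating the symmetry identity yields a
% multiplicative analogue of Bayes' law, with equality holding up to a
% multiplicative constant (written with \stackrel{\times}{=}):
\[ \mathbf{m}(x)\,\mathbf{m}(y \mid x^*)
   \stackrel{\times}{=}
   \mathbf{m}(y)\,\mathbf{m}(x \mid y^*). \]

\end{document}

Read side by side, the last identity has exactly the shape of the product form of Bayes' law, with the a priori probability m playing the role of a universal prior; this is the sense in which the symmetry of algorithmic information mirrors Bayes' law.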
