
On the Algorithmic Information Between Probabilities

by Samuel Epstein, et al.

We extend algorithmic conservation inequalities to probability measures. The amount of self information of a probability measure cannot increase when submitted to randomized processing. This includes (potentially non-computable) measures over finite sequences, infinite sequences, and T_0, second countable topologies. One example is the convolution of signals over real numbers with probability kernels. Thus the smoothing of any signal due to such a convolution cannot increase the self information of the signal.
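The convolution example can be made concrete. The sketch below (the names `smooth` and `kernel` are illustrative, not from the paper) convolves a discrete signal with a probability kernel, i.e. non-negative weights summing to 1; this is the "randomized processing" whose smoothing effect, per the abstract, cannot increase the signal's self information.

```python
# A probability kernel: non-negative weights that sum to 1.
kernel = [0.25, 0.5, 0.25]

def smooth(signal, kernel):
    """Convolve a discrete signal with a probability kernel
    (zero-padded at the boundaries)."""
    half = len(kernel) // 2
    out = []
    for i in range(len(signal)):
        acc = 0.0
        for j, w in enumerate(kernel):
            k = i + j - half
            if 0 <= k < len(signal):
                acc += w * signal[k]
        out.append(acc)
    return out

# A sharp spike is spread out by the kernel: the smoothed signal is
# "simpler" than the original, never more distinctive.
spike = [0.0, 0.0, 1.0, 0.0, 0.0]
print(smooth(spike, kernel))  # -> [0.0, 0.25, 0.5, 0.25, 0.0]
```

This is only a finite, computable caricature of the paper's setting, which covers potentially non-computable measures over infinite sequences and general topologies.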



On the Algorithmic Content of Quantum Measurements

We show that given a quantum measurement, for an overwhelming majority o...

Algorithmic No-Cloning Theorem

We introduce the notions of algorithmic mutual information and rarity of...

Generating Randomness from a Computable, Non-random Sequence of Qubits

Nies and Scholz introduced the notion of a state to describe an infinite...

Objective and Subjective Solomonoff Probabilities in Quantum Mechanics

Algorithmic probability has shown some promise in dealing with the proba...

An equivalence between learning of data and probability distributions, and some applications

Algorithmic learning theory traditionally studies the learnability of ef...

Proofs of conservation inequalities for Levin's notion of mutual information of 1974

In this paper we consider Levin's notion of mutual information in infini...

Unnormalized Measures in Information Theory

Information theory is built on probability measures and by definition a ...