On the Algorithmic Information Between Probabilities

03/13/2023
by Samuel Epstein, et al.

We extend algorithmic conservation inequalities to probability measures. The amount of self-information of a probability measure cannot increase when it is submitted to randomized processing. This includes (potentially non-computable) measures over finite sequences, infinite sequences, and T_0, second countable topologies. One example is the convolution of signals over the real numbers with probability kernels; thus the smoothing of a signal by convolution with a probability kernel cannot increase its self-information.
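The convolution example above can be illustrated concretely. The sketch below is not from the paper; it simply shows the kind of randomized processing the abstract refers to: convolving a discrete signal with a normalized probability kernel, which spreads (smooths) the signal rather than sharpening it. The function name `smooth` and the specific kernel are illustrative assumptions.

```python
import numpy as np

def smooth(signal, kernel):
    """Convolve a signal with a kernel normalized to a probability measure.

    This is an illustrative stand-in for the 'randomized processing'
    discussed in the abstract, not the paper's formal construction.
    """
    kernel = np.asarray(kernel, dtype=float)
    kernel = kernel / kernel.sum()  # ensure the kernel sums to 1
    return np.convolve(signal, kernel, mode="same")

signal = np.array([0.0, 0.0, 1.0, 0.0, 0.0])  # a sharp spike
kernel = np.array([1.0, 2.0, 1.0])            # simple binomial kernel
smoothed = smooth(signal, kernel)
print(smoothed)  # the spike spreads out: [0.  0.25 0.5  0.25 0. ]
```

Because the kernel is a probability measure, total mass is conserved while the peak is flattened, which is the intuition behind smoothing being unable to create information.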


Related research

- On the Algorithmic Content of Quantum Measurements (02/07/2021)
- Algorithmic No-Cloning Theorem (08/09/2018)
- Generating Randomness from a Computable, Non-random Sequence of Qubits (05/01/2020)
- Objective and Subjective Solomonoff Probabilities in Quantum Mechanics (07/30/2018)
- An equivalence between learning of data and probability distributions, and some applications (01/05/2018)
- Proofs of conservation inequalities for Levin's notion of mutual information of 1974 (11/13/2019)
- Unnormalized Measures in Information Theory (02/06/2022)
