Inequalities for space-bounded Kolmogorov complexity
There is a parallelism between Shannon information theory and algorithmic information theory. In particular, the same linear inequalities are true for Shannon entropies of tuples of random variables and for Kolmogorov complexities of tuples of strings (Hammer et al., 1997), as well as for sizes of subgroups and projections of sets (Chan, Yeung, Romashchenko, Shen, Vereshchagin, 1998-2002). This parallelism started with the Kolmogorov-Levin formula (1968) for the complexity of pairs of strings with logarithmic precision. Longpré (1986) proved a version of this formula for space-bounded complexities. In this paper we prove an improved version of Longpré's result with a tighter space bound, using Sipser's trick (1980). Then, using this space bound, we show that every linear inequality that is true for complexities or entropies is also true for space-bounded Kolmogorov complexities with a polynomial space overhead.
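For context, the Kolmogorov-Levin formula mentioned above, and an example of a linear inequality that holds in both settings, can be stated as follows (a standard formulation; the exact error terms vary by source and are not taken from this abstract):

```latex
% Kolmogorov-Levin formula (1968): the complexity of a pair,
% with logarithmic precision
C(x, y) = C(x) + C(y \mid x) + O(\log C(x, y))

% Example of a linear inequality valid for Shannon entropies and,
% up to logarithmic terms, for Kolmogorov complexities:
% submodularity, equivalent to I(x : z \mid y) \ge 0
C(x, y) + C(y, z) \ge C(x, y, z) + C(y)
```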