Random Access in Distributed Source Coding
The lossless compression of a single source X^n was recently shown to be achievable with a notion of strong locality: any X_i can be decoded by accessing a constant number of compressed bits, with a probability of error that vanishes in n. In contrast to the single-source setup, we show that for two separately encoded sources (X^n, Y^n), lossless compression and strong locality are in general not simultaneously achievable. More precisely, we show that for the class of "confusable" sources, strong locality cannot be achieved whenever one of the sources is compressed below its entropy. In this case, irrespective of n, the probability of error in decoding any pair (X_i, Y_i) is lower bounded by 2^{-O(d_loc)}, where d_loc denotes the number of compressed bits accessed by the local decoder. Conversely, if the sources are not confusable, strong locality is possible even if one of the sources is compressed below its entropy. The results extend to any number of sources.
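The lower bound stated above can be written out as follows; this is only a hedged formalization of the abstract's claim, with the notation P_e^{(i)} (probability of error in decoding the pair (X_i, Y_i)) and the constant c > 0 introduced here for illustration rather than taken from the paper:

% Sketch of the claimed lower bound for confusable sources, assuming one
% encoder operates at a rate below the entropy of its source; c > 0 is an
% unspecified constant and d_loc the number of compressed bits the local
% decoder accesses.
\[
  P_e^{(i)} \;=\; \Pr\!\left[(\hat{X}_i,\hat{Y}_i) \neq (X_i,Y_i)\right]
  \;\geq\; 2^{-c\, d_{\mathrm{loc}}}
  \qquad \text{for every } i \text{ and every } n.
\]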