Sharp Second-Order Pointwise Asymptotics for Lossless Compression with Side Information

05/21/2020
by Lampros Gavalakis, et al.

The problem of determining the best achievable performance of arbitrary lossless compression algorithms is examined when correlated side information is available at both the encoder and the decoder. For arbitrary source-side information pairs, the conditional information density is shown to provide a sharp asymptotic lower bound for the description lengths achieved by an arbitrary sequence of compressors. This implies that, for ergodic source-side information pairs, the conditional entropy rate is the best achievable asymptotic lower bound to the rate, not just in expectation but with probability one. Under appropriate mixing conditions, a central limit theorem and a law of the iterated logarithm are proved, describing the inevitable fluctuations of the second-order asymptotically best possible rate. An idealised version of Lempel-Ziv coding with side information is shown to be universally first- and second-order asymptotically optimal, under the same conditions. These results are in part based on a new almost-sure invariance principle for the conditional information density, which may be of independent interest.
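To fix ideas, here is a minimal LaTeX sketch of the quantities the abstract refers to: the conditional information density, its first-order limit (the conditional entropy rate), and illustrative second-order (CLT and LIL) normalisations. The notation below, in particular the conditional varentropy rate \sigma^2, is assumed for this sketch and is not a verbatim statement of the paper's theorems.

    % Conditional information density of the pair (X_1^n, Y_1^n):
    \[
      \iota_n \;=\; -\log P_{X_1^n \mid Y_1^n}\!\bigl(X_1^n \mid Y_1^n\bigr).
    \]
    % First order: for ergodic source-side information pairs, with probability one,
    \[
      \tfrac{1}{n}\,\iota_n \;\longrightarrow\; H(X \mid Y)
      \quad \text{(the conditional entropy rate)}.
    \]
    % Second order, under appropriate mixing conditions (illustrative forms):
    \[
      \frac{\iota_n - n H(X \mid Y)}{\sqrt{n}} \;\xrightarrow{d}\; N(0,\sigma^2),
      \qquad
      \limsup_{n \to \infty}\,
      \frac{\iota_n - n H(X \mid Y)}{\sqrt{2 \sigma^2 n \log\log n}} \;=\; 1 \ \text{a.s.}
    \]

Read this way, the second-order results say that the description lengths of any sequence of compressors must exhibit fluctuations of at least this \sqrt{n} order around n H(X|Y), and that the idealised Lempel-Ziv scheme with side information attains them.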
