Variable-Length Source Dispersions Differ under Maximum and Average Error Criteria
Variable-length compression without prefix-free constraints and with side information available at both the encoder and the decoder is considered. Instead of requiring the code to be error-free, we allow it to have a non-vanishing error probability. We derive one-shot bounds on the optimal average codeword length by proposing two new information quantities, namely the conditional and unconditional ε-cutoff entropies. Using these one-shot bounds, we obtain the second-order asymptotics of the problem under two formalisms: the average and the maximum probability of error over the realization of the side information. While the first-order terms in the asymptotic expansions coincide under both formalisms, we find that the source dispersion under the average-error formalism is, in most cases, strictly smaller than its maximum-error counterpart. Applications to a class of guessing problems previously studied by Kuzuoka (2019) are also discussed.
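To make the comparison concrete, the display below sketches the generic shape of a second-order (dispersion-type) expansion of the optimal average codeword length. It is modeled on the known expansion for variable-length compression allowing errors without side information; the precise constants, and in particular the value of the dispersion term, are assumptions for illustration rather than the theorem established in the paper.

\[
  \mathbb{E}\big[\ell^*(X^n \mid Y^n)\big]
  \;=\; (1-\varepsilon)\, n\, H(X \mid Y)
  \;-\; \sqrt{\frac{n\, V_\varepsilon}{2\pi}}\; e^{-\frac{1}{2}\left[Q^{-1}(\varepsilon)\right]^2}
  \;+\; O(\log n),
\]

where $H(X \mid Y)$ is the conditional entropy (the first-order term common to both formalisms), $Q^{-1}$ is the inverse of the Gaussian complementary distribution function, and $V_\varepsilon$ is a placeholder for the source dispersion, which is the quantity that takes different values under the average- and maximum-error formalisms.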