Sharp Bounds for Mutual Covering
Verdú reformulated the covering problem in the non-asymptotic, information-theoretic setting as a lower bound on the covering probability for any set that has large probability under a given joint distribution. The covering probability is the probability that, among a given number of independently generated candidates, there exists a pair of random variables that falls within the given set. We use a weighted-sum trick and Talagrand's concentration inequality to prove new mutual covering bounds. We identify two interesting applications: 1) When the probability of the set under the given joint distribution is bounded away from 0 and 1, the covering probability converges to 1 doubly exponentially fast in the blocklength, which implies that the covering lemma does not induce penalties on the error exponents in applications to coding theorems. 2) Using Hall's marriage lemma, we show that the maximum difference between the probability of the set under the joint distribution and the covering probability equals half the minimum total variation distance between the joint distribution and any distribution that can be simulated by selecting a pair from the candidates. Thus we use the mutual covering bound to derive the exact error exponent in the joint distribution simulation problem. In both applications, the determination of the exact exponential (or doubly exponential) behavior relies crucially on the sharp concentration inequality used in the proof of the mutual covering lemma.
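To make the two claims above concrete, the following LaTeX sketch restates them in notation of our own choosing: the joint distribution P_{XY}, the candidate lists X_1,...,X_M and Y_1,...,Y_N, the set A, the constant c, and the simulable family Q_{M,N} are illustrative labels that do not appear in the abstract, and the precise conditions, constants, and normalization of the total variation distance are as specified in the full text.

% Covering probability of a set A, with X_1,...,X_M i.i.d. ~ P_X and
% Y_1,...,Y_N i.i.d. ~ P_Y generated independently of each other:
\[
  p_{\mathrm{cov}}(\mathcal{A}) \;=\;
  \Pr\bigl\{ \exists\, (i,j) \in [M]\times[N] : (X_i, Y_j) \in \mathcal{A} \bigr\}.
\]
% Application 1, schematically: for product distributions at blocklength n and
% sets A_n with P_{XY}^n(A_n) bounded away from 0 and 1, there is some c > 0
% (its dependence is specified in the paper) such that
\[
  1 - p_{\mathrm{cov}}(\mathcal{A}_n) \;\le\; \exp\bigl(-\exp(c\, n)\bigr)
  \quad \text{for all sufficiently large } n,
\]
% i.e., the covering probability tends to 1 doubly exponentially fast.
% Application 2: with Q_{M,N} denoting the set of joint distributions obtainable
% by selecting one pair (X_I, Y_J) from the candidates,
\[
  \max_{\mathcal{A}} \Bigl( P_{XY}(\mathcal{A}) - p_{\mathrm{cov}}(\mathcal{A}) \Bigr)
  \;=\; \tfrac{1}{2}\, \min_{Q \in \mathcal{Q}_{M,N}} \bigl\| P_{XY} - Q \bigr\|_{\mathrm{TV}}.
\]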