Strong Converses using Typical Changes of Measures and Asymptotic Markov Chains
The paper presents exponentially-strong converses for source coding, channel coding, and hypothesis testing problems. More specifically, it presents alternative proofs of the well-known exponentially-strong converse bounds for almost-lossless source coding with side information and for channel coding over a discrete memoryless channel (DMC). These alternative proofs rely solely on a change of measure argument on the set of conditionally or jointly typical sequences that lead to a correct decision, and on the analysis of the resulting measures in the asymptotic regime of infinite blocklengths. The paper also presents a new exponentially-strong converse for the K-hop hypothesis testing against independence problem with certain Markov chains, and a strong converse for the two-terminal L-round interactive compression problem with multiple distortion constraints that depend on both sources and both reconstructions. This latter problem includes as special cases the Wyner-Ziv problem, the interactive function computation problem, and the compression with lossy common reconstruction problem. These new strong converse proofs are derived using similar change of measure arguments as described above and by additionally proving that certain Markov chains involving auxiliary random variables hold in the asymptotic regime of infinite blocklengths. As shown in related publications, the same method also yields converse bounds under expected resource constraints.
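As a rough illustration of the kind of change of measure argument described above (the notation $\mathcal{D}_n$ and $\tilde{P}$ below is hypothetical and not taken from the paper), one restricts the source measure to the set of typical sequences that lead to a correct decision and renormalizes:
% Illustrative notation (not the paper's): D_n = typical correct-decision set, \tilde{P} = restricted measure.
\[
  \tilde{P}_{X^n Y^n}(x^n, y^n) \;\triangleq\;
  \frac{P_{X^n Y^n}(x^n, y^n)\,\mathbb{1}\{(x^n,y^n)\in\mathcal{D}_n\}}
       {P_{X^n Y^n}(\mathcal{D}_n)}.
\]
Since $\tilde{P}/P = 1/P_{X^n Y^n}(\mathcal{D}_n)$ on the support of $\tilde{P}$,
\[
  \frac{1}{n}\,D\bigl(\tilde{P}_{X^n Y^n}\,\big\|\,P_{X^n Y^n}\bigr)
  \;=\; \frac{1}{n}\log\frac{1}{P_{X^n Y^n}(\mathcal{D}_n)},
\]
which vanishes whenever the probability of correct decision does not decay exponentially fast; bounds established under $\tilde{P}$, whose limiting single-letter behaviour is controlled through the typicality of $\mathcal{D}_n$, then carry over to the original measure in the limit of infinite blocklengths.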