Channel Simulation: Finite Blocklengths and Broadcast Channels
We study channel simulation under common-randomness assistance in the finite-blocklength regime and identify the smooth channel max-information as a linear-programming one-shot converse on the minimal simulation cost for a fixed error tolerance. We show that this one-shot converse can be achieved exactly using no-signaling-assisted codes, and approximately achieved using common-randomness-assisted codes. Our one-shot converse thus takes on a role analogous to that of the celebrated meta-converse in the complementary problem of channel coding. We asymptotically expand our bounds on the simulation cost for discrete memoryless channels, leading to second-order as well as moderate-deviation rate expansions. These can be expressed in terms of the channel capacity and channel dispersion known from noisy channel coding. Our bounds then imply the well-known fact that the optimal asymptotic first-order rate for one channel to simulate another under common-randomness assistance is given by the ratio of their respective capacities. Additionally, our higher-order asymptotic expansion shows that this reversibility breaks down at second order. Our techniques extend to discrete memoryless broadcast channels. In stark contrast to the elusive broadcast channel capacity problem, we show that the reverse problem of broadcast channel simulation under common-randomness assistance admits an efficiently computable single-letter characterization of the asymptotic rate region in terms of the channel's multipartite mutual information. We present a Blahut-Arimoto-type algorithm to compute the rate region efficiently. This finding, together with standard bounds on the broadcast channel capacity, implies that channel interconversion under common-randomness assistance is asymptotically irreversible.
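The paper's algorithm computes the broadcast simulation rate region; as a point of reference for the Blahut-Arimoto-type iteration it builds on, the following is a minimal sketch of the classical Blahut-Arimoto algorithm for the capacity of a point-to-point discrete memoryless channel (not the paper's broadcast variant). The function name and interface are illustrative choices, not taken from the paper.

```python
import numpy as np

def blahut_arimoto(W, iters=500, tol=1e-10):
    """Classical Blahut-Arimoto iteration for the capacity of a DMC.

    W: channel matrix with W[x, y] = probability of output y given input x.
    Returns (capacity in bits, capacity-achieving input distribution).
    """
    n_in = W.shape[0]
    p = np.full(n_in, 1.0 / n_in)  # start from the uniform input distribution
    for _ in range(iters):
        # Backward channel q(x|y) proportional to p(x) * W(y|x).
        q = p[:, None] * W
        q /= q.sum(axis=0, keepdims=True)
        # Update p(x) proportional to exp(sum_y W(y|x) * log q(x|y)),
        # treating 0 * log 0 as 0.
        logq = np.where(W > 0, np.log(np.where(q > 0, q, 1.0)), 0.0)
        r = np.exp((W * logq).sum(axis=1))
        p_new = r / r.sum()
        if np.max(np.abs(p_new - p)) < tol:
            p = p_new
            break
        p = p_new
    # Mutual information I(p, W) in bits at the final input distribution.
    joint = p[:, None] * W
    py = joint.sum(axis=0)
    mask = joint > 0
    prod = (p[:, None] * py[None, :])[mask]
    capacity = float((joint[mask] * np.log2(joint[mask] / prod)).sum())
    return capacity, p

# Example: binary symmetric channel with crossover probability 0.1;
# its capacity is 1 - h(0.1), roughly 0.531 bits.
W_bsc = np.array([[0.9, 0.1],
                  [0.1, 0.9]])
C, p_opt = blahut_arimoto(W_bsc)
```

Each iteration alternates a closed-form optimization over the backward channel and over the input distribution, so the mutual information is nondecreasing and converges to capacity; the broadcast simulation rate region of the paper is characterized by an analogous alternating optimization over multipartite mutual information terms.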