Shannon meets von Neumann: A Minimax Theorem for Channel Coding in the Presence of a Jammer
We study channel coding over a family of channels whose state is controlled by an adversarial jammer, viewing it as a zero-sum game between a finite-blocklength encoder-decoder team and the jammer. The encoder-decoder team chooses stochastic encoding and decoding strategies to minimize the average probability of error in transmission, while the jammer chooses a distribution on the state space to maximize this probability. The min-max value of this game is equivalent to channel coding for a compound channel; we call this the Shannon solution of the problem. The max-min value corresponds to finding a mixed channel with the largest value of the minimum achievable probability of error. When the min-max and max-min values are equal, the problem is said to admit a saddle point, or von Neumann solution. While a Shannon solution always exists, a von Neumann solution need not, owing to inherent nonconvexity in the communicating team's problem. Despite this, we show that the min-max and max-min values become equal asymptotically in the large-blocklength limit, for all but finitely many rates. We explicitly characterize this limiting value as a function of the rate and obtain tight finite-blocklength bounds on the min-max and max-min values. As a corollary, we obtain an explicit expression for the ϵ-capacity of a compound channel under stochastic codes, which, to the best of our knowledge, is the first such result. Our results reveal a deeper relation between the compound channel and the mixed channel than was previously known. They also show that the conventional information-theoretic viewpoint, articulated via the Shannon solution, coincides asymptotically with the game-theoretic one, articulated via the von Neumann solution.
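In schematic terms, and with notation assumed here for illustration rather than taken from the paper, the two values of the game can be written as follows, where $(f,\varphi)$ denotes a stochastic encoder-decoder pair, $q$ a distribution on the blocklength-$n$ state space $\mathcal{S}^n$, and $P_e$ the average probability of error:

% Schematic min-max / max-min formulation of the jamming game.
% The symbols f, \varphi, q, P_e are assumed notation, not quoted from the paper.
\[
  \overline{V}_n \;=\; \min_{(f,\varphi)} \; \max_{q \in \mathcal{P}(\mathcal{S}^n)}
      \mathbb{E}_{S^n \sim q}\!\left[ P_e\!\left(f,\varphi; S^n\right) \right],
  \qquad
  \underline{V}_n \;=\; \max_{q \in \mathcal{P}(\mathcal{S}^n)} \; \min_{(f,\varphi)}
      \mathbb{E}_{S^n \sim q}\!\left[ P_e\!\left(f,\varphi; S^n\right) \right].
\]
% Weak duality gives \underline{V}_n \le \overline{V}_n for every n;
% a von Neumann (saddle-point) solution exists exactly when the two values coincide.

Under this reading, the min-max (Shannon) value reduces to compound-channel coding, since the inner maximization is linear in $q$ and is attained at a worst-case state sequence, while the max-min value fixes $q$ first, so the team faces the channel averaged over $q$, i.e., a mixed channel.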