Distributed Hypothesis Testing over a Noisy Channel: Error-exponents Trade-off

08/21/2019
by Sreejith Sreekumar, et al.

A distributed hypothesis testing problem with two parties, one referred to as the observer and the other as the detector, is considered. The observer observes a discrete memoryless source and communicates its observations to the detector over a discrete memoryless noisy channel. The detector observes side-information correlated with the observer's observations and performs a binary hypothesis test on the joint probability distribution of its own observations and those of the observer. With the objective of characterizing the performance of the hypothesis test, we obtain two inner bounds on the trade-off between the exponents of the type I and type II error probabilities. The first inner bound is obtained using a combination of a type-based quantize-bin scheme and Borade et al.'s unequal error protection scheme, while the second is established using a type-based hybrid coding scheme. These bounds extend the achievability result of Han and Kobayashi, obtained for the special case of a rate-limited noiseless channel, to a noisy channel. For the special case of testing for the marginal distribution of the observer's observations with no side-information at the detector, we establish a single-letter characterization of the optimal trade-off between the type I and type II error-exponents. Our results imply that a separation holds in this case, in the sense that the optimal trade-off between the error-exponents is achieved by a scheme that performs hypothesis testing and channel coding independently.
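
For context, a minimal sketch of the standard error-exponent formulation (the notation below is assumed for illustration and is not quoted from the paper, whose definitions may differ in detail): with \alpha_n and \beta_n denoting the type I and type II error probabilities of a test that uses n source samples,

  E_1 \triangleq \liminf_{n \to \infty} -\frac{1}{n} \log \alpha_n,   \qquad   E_2 \triangleq \liminf_{n \to \infty} -\frac{1}{n} \log \beta_n,

and the trade-off in question is the set of simultaneously achievable exponent pairs (E_1, E_2).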
