Strong Converse for Hypothesis Testing Against Independence Over A Noisy Channel

06/04/2020 · by Daming Cao, et al.

We revisit the problem of hypothesis testing against independence over a noisy channel and prove a strong converse theorem. In particular, under the Neyman-Pearson formulation, we derive a non-asymptotic upper bound on the type-II exponent of any encoding-decoding functions that ensure the type-I error probability is upper bounded by a constant. The strong converse theorem for the problem follows as a corollary of our result. Our proof relies on the strong converse technique recently proposed by Tyagi and Watanabe (TIT 2020), which builds on the change-of-measure technique. Our work is the first application of the technique of Tyagi and Watanabe to a hypothesis testing problem over a noisy channel, and thus further demonstrates the generality of the technique.
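For background (not part of this paper's noisy-channel result): in the classical centralized setting, Stein's lemma states that the optimal type-II error exponent for testing a joint distribution P_XY against the product of its marginals equals the mutual information I(X;Y), i.e., the relative entropy D(P_XY || P_X P_Y). The sketch below computes this exponent for an illustrative joint pmf; the numbers are hypothetical and chosen only for demonstration.

```python
import numpy as np

# Illustrative joint pmf P_XY on a 2x2 alphabet (hypothetical values).
P = np.array([[0.4, 0.1],
              [0.1, 0.4]])

Px = P.sum(axis=1)    # marginal of X
Py = P.sum(axis=0)    # marginal of Y
Q = np.outer(Px, Py)  # product of marginals: the "independence" alternative

# Stein exponent for testing P_XY vs P_X * P_Y:
# D(P_XY || P_X P_Y) = I(X;Y), in nats.
exponent = float(np.sum(P * np.log(P / Q)))
print(exponent)
```

The noisy-channel setting studied in the paper tightens this picture: the strong converse shows the exponent cannot be improved even when the type-I error is only required to stay below a constant.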





