Trading information complexity for error II

09/26/2018
by Yaqiao Li et al.

Consider the standard two-party communication model. This paper, following up on [Y. Dagan, Y. Filmus, H. Hatami, and Y. Li, Trading information complexity for error, Theory of Computing 14 (2018), no. 6, 1–73], studies how the information complexity of a function depends on the error allowed in computing it. Two of our main results on prior-free information complexity are the following. For both internal and external information complexity, we show that to compute any function with error 1/2−ϵ, at least Ω(ϵ^2) and at most O(ϵ) information must be revealed; both bounds are tight, as witnessed by examples from [M. Braverman and A. Moitra, An information complexity approach to extended formulations, STOC, ACM, 2013]. For external information complexity, we show that when computing any function with small error ϵ>0, compared with computing it with zero error, one can save at least Ω(ϵ) and at most O(h(√ϵ)) of external information cost, where h(·) is the binary entropy function. We do not know whether either of these bounds is tight. However, when we consider prior-free external information complexity over only product distributions, the lower bound can be improved to Ω(h(ϵ)), with a simpler proof than the one in [Y. Dagan, Y. Filmus, H. Hatami, and Y. Li, Trading information complexity for error, Theory of Computing 14 (2018), no. 6, 1–73].
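For reference, h(·) in the bounds above is the standard binary entropy function (its definition is not restated in the abstract itself):

h(x) = -x \log_2 x - (1 - x) \log_2 (1 - x), \qquad x \in [0, 1].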
