Query-to-Communication Lifting Using Low-Discrepancy Gadgets

04/30/2019
by Arkadev Chattopadhyay, et al.

Lifting theorems are theorems that relate the query complexity of a function f:{0,1}^n→{0,1} to the communication complexity of the composed function f ∘ g^n, for some "gadget" g:{0,1}^b×{0,1}^b→{0,1}. Such theorems allow transferring lower bounds from query complexity to communication complexity, and have seen numerous applications in recent years. In addition, such theorems can be viewed as a strong generalization of a direct-sum theorem for the gadget g. We prove a new lifting theorem that works for all gadgets g that have logarithmic length and exponentially small discrepancy, for both deterministic and randomized communication complexity. Thus, we significantly increase the range of gadgets for which such lifting theorems hold. Our result has two main motivations: First, allowing a larger variety of gadgets may support more applications. In particular, our work is the first to prove a randomized lifting theorem for logarithmic-size gadgets, thus improving some applications of the theorem. Second, our result can be seen as a strong generalization of a direct-sum theorem for functions with low discrepancy.
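For intuition, the composed function is evaluated as (f ∘ g^n)(x, y) = f(g(x_1, y_1), ..., g(x_n, y_n)), where Alice holds x = (x_1, ..., x_n) and Bob holds y = (y_1, ..., y_n), each block b bits long. A standard example of a gadget with exponentially small discrepancy is inner product mod 2 on b-bit blocks. The sketch below is purely illustrative: the function names and the choice of inner product as the gadget are assumptions for the example, not the paper's construction.

```python
# Minimal sketch of the composed function f ∘ g^n (illustrative, not from the paper).
# Alice holds x = (x_1, ..., x_n), Bob holds y = (y_1, ..., y_n), each block b bits long.
# The gadget used here is inner product mod 2, a standard example of a gadget whose
# discrepancy is exponentially small in its input length b.

from typing import Callable, Sequence

def inner_product_gadget(x_block: Sequence[int], y_block: Sequence[int]) -> int:
    """g(x_i, y_i): inner product of two b-bit blocks, modulo 2."""
    return sum(a & c for a, c in zip(x_block, y_block)) % 2

def compose(f: Callable[[Sequence[int]], int],
            g: Callable[[Sequence[int], Sequence[int]], int],
            x: Sequence[Sequence[int]],
            y: Sequence[Sequence[int]]) -> int:
    """(f ∘ g^n)(x, y) = f(g(x_1, y_1), ..., g(x_n, y_n))."""
    z = [g(xi, yi) for xi, yi in zip(x, y)]
    return f(z)

# Example usage: compose the n-bit OR function with the inner-product gadget.
if __name__ == "__main__":
    f_or = lambda z: int(any(z))
    x = [[1, 0, 1], [0, 0, 0]]   # Alice's input: n = 2 blocks of b = 3 bits
    y = [[1, 1, 0], [1, 1, 1]]   # Bob's input
    print(compose(f_or, inner_product_gadget, x, y))  # prints 1
```

A lifting theorem then says that the communication required to compute this composed function is governed by the query complexity of f, so a query lower bound for f transfers to a communication lower bound for f ∘ g^n.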
