Oblivious algorithms for the Max-kAND Problem

05/08/2023
by Noah G. Singer, et al.

Motivated by recent works on streaming algorithms for constraint satisfaction problems (CSPs), we define and analyze oblivious algorithms for the Max-kAND problem. This generalizes the definition by Feige and Jozeph (Algorithmica '15) of oblivious algorithms for Max-DICUT, a special case of Max-2AND. Oblivious algorithms round each variable with probability depending only on a quantity called the variable's bias. For each oblivious algorithm, we design a so-called "factor-revealing linear program" (LP) which captures its worst-case instance, generalizing one of Feige and Jozeph for Max-DICUT. Then, departing from their work, we perform a fully explicit analysis of these (infinitely many!) LPs. In particular, we show that for all k, oblivious algorithms for Max-kAND provably outperform a special subclass of algorithms we call "superoblivious" algorithms. Our result has implications for streaming algorithms: Generalizing the result for Max-DICUT of Saxena, Singer, Sudan, and Velusamy (SODA '23), we prove that certain separation results hold between streaming models for infinitely many CSPs: for every k, the O(log n)-space sketching algorithms for Max-kAND, which are known to be optimal among o(√n)-space algorithms, can be beaten in (a) O(log n) space under a random-ordering assumption, and (b) O(n^(1-1/k) D^(1/k)) space under a maximum-degree-D assumption. Even in the previously-known case of Max-DICUT, our analytic proof gives a fuller, computer-free picture of these separation results.
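As a rough illustration of the setup only (not the paper's specific rounding rules or its LP analysis), the sketch below shows what an oblivious algorithm for Max-kAND looks like: each variable's bias is computed from its positive versus negative occurrences across the clauses, and the variable is then set to true independently with a probability that depends only on that bias. The clause encoding, the helper names, and the particular threshold rule are all illustrative assumptions, not definitions from the paper.

```python
import random

def variable_bias(clauses, var):
    # Bias of `var`: (positive occurrences - negative occurrences) / total occurrences.
    # A clause is a k-tuple of literals; a literal is (variable, is_positive). (Assumed encoding.)
    pos = sum(1 for clause in clauses for (v, s) in clause if v == var and s)
    neg = sum(1 for clause in clauses for (v, s) in clause if v == var and not s)
    total = pos + neg
    return 0.0 if total == 0 else (pos - neg) / total

def oblivious_round(clauses, variables, p):
    # Set each variable to True independently with probability p(bias),
    # where p maps a bias in [-1, 1] to a probability in [0, 1].
    return {v: random.random() < p(variable_bias(clauses, v)) for v in variables}

def satisfied_fraction(clauses, assignment):
    # Fraction of kAND clauses with every literal satisfied.
    sat = sum(all(assignment[v] == s for (v, s) in clause) for clause in clauses)
    return sat / len(clauses)

# Toy Max-2AND instance over variables {0, 1, 2}.
clauses = [((0, True), (1, False)), ((0, True), (2, True)), ((1, True), (2, False))]
threshold_rule = lambda b: 0.9 if b > 0 else (0.5 if b == 0 else 0.1)  # illustrative rule only
assignment = oblivious_round(clauses, {0, 1, 2}, threshold_rule)
print(satisfied_fraction(clauses, assignment))
```

The key point the sketch tries to convey is the "oblivious" restriction: the rounding probability for a variable may depend only on its own bias, not on the global structure of the instance, which is what makes the worst case of such an algorithm expressible by a factor-revealing LP.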

