Fourier Growth of Parity Decision Trees
We prove that for every parity decision tree of depth d on n variables, the sum of the absolute values of the Fourier coefficients at level ℓ is at most d^{ℓ/2} · O(ℓ · log(n))^ℓ. Our result is nearly tight for small values of ℓ and extends a previous Fourier bound for standard decision trees by Sherstov, Storozhenko, and Wu (STOC, 2021).

As an application of our Fourier bounds, using the results of Bansal and Sinha (STOC, 2021), we show that the k-fold Forrelation problem has (randomized) parity decision tree complexity Ω̃(n^{1−1/k}), while having quantum query complexity ⌈k/2⌉.

Our proof follows a random-walk approach, analyzing the contribution of a random root-to-leaf path in the decision tree to the level-ℓ Fourier expression. To carry out the argument, we apply a careful cleanup procedure to the parity decision tree, ensuring that the value of the random walk is bounded with high probability. We observe that the step sizes of the level-ℓ walks can be computed from the intermediate values of the level-≤(ℓ−1) walks, which calls for an inductive argument. Our approach differs from the previous proofs of Tal (FOCS, 2020) and of Sherstov, Storozhenko, and Wu (STOC, 2021), which relied on decompositions of the tree; in particular, for the special case of standard decision trees we view our proof as slightly simpler and more intuitive. In addition, we prove a similar bound for noisy decision trees of cost at most d, a model recently introduced by Ben-David and Blais (FOCS, 2020).
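To make the quantity being bounded concrete, here is a small illustrative sketch (not from the paper): for a toy depth-2 parity decision tree on 4 variables, we compute the level-ℓ Fourier mass Σ_{|S|=ℓ} |f̂(S)| by brute force over all inputs. The tree and function names are hypothetical examples.

```python
from itertools import combinations, product

def parity_tree(x):
    """Hypothetical depth-2 parity decision tree on 4 bits:
    first query x0 XOR x1; then query x2 (on answer 0)
    or x2 XOR x3 (on answer 1), and output that parity."""
    if (x[0] ^ x[1]) == 0:
        return x[2]
    return x[2] ^ x[3]

def fourier_level_mass(f, n, ell):
    """Sum of |f_hat(S)| over all sets S of size ell, where
    f: {0,1}^n -> {0,1} is viewed as the +/-1 function (-1)^f
    and f_hat(S) = E_x [(-1)^f(x) * chi_S(x)]."""
    total = 0.0
    for S in combinations(range(n), ell):
        coeff = 0.0
        for x in product((0, 1), repeat=n):
            chi = (-1) ** sum(x[i] for i in S)  # character chi_S(x)
            coeff += ((-1) ** f(x)) * chi
        total += abs(coeff) / 2 ** n
    return total

# Level-l Fourier mass of the toy tree, for l = 0..4.
print([fourier_level_mass(parity_tree, 4, l) for l in range(5)])
# -> [0.0, 0.5, 0.5, 0.5, 0.5]
```

For this tree each of levels 1 through 4 carries mass exactly 1/2, comfortably within a bound of the form d^{ℓ/2} · O(ℓ · log(n))^ℓ with d = 2; the brute-force computation is exponential in n and serves only to make the level-ℓ quantity tangible.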