On the Role of Entanglement and Statistics in Learning

06/05/2023
by Srinivasan Arunachalam, et al.

In this work we make progress in understanding the relationship between learning models given access to entangled measurements, separable measurements, and statistical measurements in the quantum statistical query (QSQ) model. To this end, we show the following results.

Entangled versus separable measurements. The goal here is to learn an unknown f from the concept class C ⊆ {f : {0,1}^n → [k]} given copies of 1/√(2^n) ∑_x |x, f(x)⟩. We show that, if T copies suffice to learn f using entangled measurements, then O(nT^2) copies suffice to learn f using only separable measurements.

Entangled versus statistical measurements. The goal here is to learn a function f ∈ C given access to separable measurements and statistical measurements. We exhibit a concept class C that gives an exponential separation between QSQ learning and quantum learning with entangled measurements (even in the presence of noise). This proves the "quantum analogue" of the seminal result of Blum et al. [BKW'03], which separates classical SQ learning from PAC learning with classification noise.

QSQ lower bounds for learning states. We introduce a quantum statistical query dimension (QSD), which we use to give lower bounds on QSQ learning. With this we prove superpolynomial QSQ lower bounds for testing purity, shadow tomography, the Abelian hidden subgroup problem, degree-2 functions, planted bi-clique states, and output states of Clifford circuits of depth polylog(n).

Further applications. We give an unconditional separation between weak and strong error mitigation and prove lower bounds for learning distributions in the QSQ model. Prior works by Quek et al. [QFK+'22], Hinsche et al. [HIN+'22], and Nietner et al. [NIS+'23] proved the analogous results assuming diagonal measurements, and our work removes this assumption.
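For orientation, the quantum statistical query oracle underlying these results is sketched here using the usual definition from prior QSQ work (the paper's exact formalization may differ in minor details): on input an observable M with ‖M‖ ≤ 1 and a tolerance τ > 0, the oracle returns some value α satisfying

|α − ⟨ψ_f| M |ψ_f⟩| ≤ τ,  where |ψ_f⟩ = 1/√(2^n) ∑_x |x, f(x)⟩ is the quantum example state above.

Separable and entangled measurements, by contrast, act directly on copies of |ψ_f⟩, either independently across copies or jointly.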
