Optimal Spectral Recovery of a Planted Vector in a Subspace
Recovering a planted vector v in an n-dimensional random subspace of ℝ^N is a generic task related to many problems in machine learning and statistics, such as dictionary learning, subspace recovery, and principal component analysis. In this work, we study computationally efficient estimation and detection of a planted vector v whose ℓ_4 norm differs from that of a Gaussian vector with the same ℓ_2 norm. For instance, in the special case of an Nρ-sparse vector v with Rademacher nonzero entries, our results include the following:

(1) We give an improved analysis of (a slight variant of) the spectral method proposed by Hopkins, Schramm, Shi, and Steurer, showing that it approximately recovers v with high probability in the regime nρ ≪ √N (a sketch of a method in this style appears below). In contrast, previous work required either ρ ≪ 1/√n or n√ρ ≲ √N for polynomial-time recovery. Our result subsumes both of these conditions (up to logarithmic factors) and also treats the dense case ρ = 1, which was not previously considered.

(2) Akin to ℓ_∞ bounds for eigenvector perturbation, we establish an entrywise error bound for the spectral estimator via a leave-one-out analysis, from which it follows that thresholding recovers v exactly (see the thresholding sketch below).

(3) We study the associated detection problem and show that in the regime nρ ≫ √N, any spectral method from a large class (and, more generally, any low-degree polynomial of the input) fails to detect the planted vector. This establishes optimality of our upper bounds and offers evidence that no polynomial-time algorithm can succeed when nρ ≫ √N.
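To make the spectral approach of result (1) concrete, here is a minimal sketch in the style of the Hopkins–Schramm–Shi–Steurer method, not the paper's exact variant. It assumes the subspace is given by a matrix B ∈ ℝ^{N×n} whose columns form an orthonormal basis; the centered weighting ‖b_i‖² − n/N applied to the rows b_i ∈ ℝ^n, and all names below, are illustrative assumptions rather than details taken from the paper.

```python
import numpy as np

def spectral_estimate(B):
    """Sketch of an HSSS-style spectral estimator (illustrative only).

    B : (N, n) array whose columns are an orthonormal basis of the
        n-dimensional subspace of R^N containing the planted vector.
    Returns a unit-norm estimate of the planted direction in R^N.
    """
    N, n = B.shape
    norms2 = np.sum(B**2, axis=1)  # ||b_i||^2 for each row b_i in R^n
    # Centered, norm-weighted second-moment matrix: rows with atypically
    # large norm (where a sparse planted vector concentrates its mass)
    # are upweighted, while subtracting n/N (the mean of ||b_i||^2 for
    # an orthonormal B) cancels the purely Gaussian contribution.
    A = (B * (norms2 - n / N)[:, None]).T @ B
    # The eigenvector of the largest eigenvalue gives the coefficients
    # of the estimate in the basis B.
    eigvals, eigvecs = np.linalg.eigh(A)
    u = eigvecs[:, -1]
    v_hat = B @ u
    return v_hat / np.linalg.norm(v_hat)
```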
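Result (2) then says the spectral estimate is accurate entrywise, so simple thresholding recovers v exactly. A hedged sketch under the Rademacher model: if v is Nρ-sparse with nonzero entries ±1/√(ρN) (so that ‖v‖_2 = 1), thresholding at half that magnitude separates support from non-support; the factor 1/2 is an illustrative choice, not a constant from the paper.

```python
def threshold_recover(v_hat, rho):
    """Exact recovery by entrywise thresholding (illustrative sketch).

    Assumes the planted v is N*rho-sparse with nonzero entries of
    magnitude 1/sqrt(rho*N), so entrywise-accurate estimates of the
    nonzero coordinates stand out above the threshold.
    """
    N = v_hat.shape[0]
    tau = 0.5 / np.sqrt(rho * N)  # half the magnitude of a nonzero entry
    support = np.abs(v_hat) > tau
    v_rec = np.zeros(N)
    v_rec[support] = np.sign(v_hat[support]) / np.sqrt(rho * N)
    return v_rec
```

As a quick end-to-end check (again purely illustrative), one can plant v by taking B to be an orthonormal basis of span(v, g_1, …, g_{n−1}) for i.i.d. Gaussian vectors g_j, for example via a QR decomposition, run spectral_estimate, and compare threshold_recover's output against ±v (the global sign is not identifiable).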