Optimal Spectral Recovery of a Planted Vector in a Subspace

05/31/2021
by Cheng Mao, et al.

Recovering a planted vector v in an n-dimensional random subspace of ℝ^N is a generic task related to many problems in machine learning and statistics, such as dictionary learning, subspace recovery, and principal component analysis. In this work, we study computationally efficient estimation and detection of a planted vector v whose ℓ_4 norm differs from that of a Gaussian vector with the same ℓ_2 norm. For instance, in the special case of an Nρ-sparse vector v with Rademacher nonzero entries, our results include the following:

(1) We give an improved analysis of (a slight variant of) the spectral method proposed by Hopkins, Schramm, Shi, and Steurer, showing that it approximately recovers v with high probability in the regime nρ ≪ √N. In contrast, previous work required either ρ ≪ 1/√n or n√ρ ≲ √N for polynomial-time recovery. Our result subsumes both of these conditions (up to logarithmic factors) and also treats the dense case ρ = 1, which was not previously considered.

(2) Akin to ℓ_∞ bounds for eigenvector perturbation, we establish an entrywise error bound for the spectral estimator via a leave-one-out analysis, from which it follows that thresholding recovers v exactly.

(3) We study the associated detection problem and show that in the regime nρ ≫ √N, any spectral method from a large class (and more generally, any low-degree polynomial of the input) fails to detect the planted vector. This establishes optimality of our upper bounds and offers evidence that no polynomial-time algorithm can succeed when nρ ≫ √N.
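
To make the abstract's pipeline concrete, here is a minimal NumPy sketch of the spectral method in the style of Hopkins, Schramm, Shi, and Steurer, followed by the thresholding step of result (2). This is an illustration under stated assumptions, not code from the paper: the helper names (planted_subspace, spectral_estimate), the parameter choices, and the threshold constant 1/2 are all hypothetical, and the matrix M = ∑_i (‖b_i‖² − n/N) b_i b_iᵀ built from the rows b_i of an orthonormal basis is one common presentation of this spectral method.

import numpy as np

def planted_subspace(N, n, rho, rng):
    """Sample a toy instance (hypothetical helper): an orthonormal basis of an
    n-dimensional subspace of R^N containing a planted rho*N-sparse vector."""
    # Planted vector: rho*N Rademacher (+/-1) entries, scaled to unit l2 norm.
    v = np.zeros(N)
    support = rng.choice(N, size=int(rho * N), replace=False)
    v[support] = rng.choice([-1.0, 1.0], size=support.size)
    v /= np.linalg.norm(v)
    # Complete the subspace with n-1 independent Gaussian directions.
    basis, _ = np.linalg.qr(np.column_stack([v, rng.standard_normal((N, n - 1))]))
    # Scramble the basis: the algorithm sees only the subspace, not v itself.
    Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
    return basis @ Q, v

def spectral_estimate(B):
    """HSSS-style spectral method: with b_i the rows of an orthonormal basis
    B in R^{N x n}, form M = sum_i (||b_i||^2 - n/N) b_i b_i^T, take the
    eigenvector u with largest |eigenvalue|, and return the unit vector B u."""
    N, n = B.shape
    w = np.sum(B**2, axis=1) - n / N        # centered row norms ||b_i||^2 - n/N
    M = (B * w[:, None]).T @ B              # sum_i w_i b_i b_i^T  (n x n)
    eigvals, eigvecs = np.linalg.eigh(M)
    u = eigvecs[:, np.argmax(np.abs(eigvals))]
    vhat = B @ u
    return vhat / np.linalg.norm(vhat)

rng = np.random.default_rng(0)
N, n, rho = 4000, 40, 0.01                  # regime n*rho = 0.4 << sqrt(N) ~ 63
B, v = planted_subspace(N, n, rho, rng)
vhat = spectral_estimate(B)
print(abs(v @ vhat))                        # correlation with v, close to 1

# Result (2): entrywise control lets thresholding recover the support exactly;
# nonzero entries of v have magnitude 1/sqrt(rho*N), so threshold at half that.
support_hat = np.flatnonzero(np.abs(vhat) > 0.5 / np.sqrt(rho * N))
print(np.array_equal(support_hat, np.flatnonzero(v)))

On a typical run under these assumptions, the printed correlation is near 1 and the support check passes, illustrating results (1) and (2). In the complementary regime nρ ≫ √N, result (3) indicates that no spectral test from this class (nor any low-degree polynomial test) should succeed at detection.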


Related research

- 02/08/2022 · Entrywise Recovery Guarantees for Sparse PCA via Sparsistent Algorithms
- 11/20/2013 · Sparse PCA via Covariance Thresholding
- 07/26/2019 · Subexponential-Time Algorithms for Sparse PCA
- 12/15/2014 · Finding a sparse vector in a subspace: Linear sparsity using alternating directions
- 12/01/2019 · On the optimality of kernels for high-dimensional clustering
- 07/06/2014 · Dictionary Learning and Tensor Decomposition via the Sum-of-Squares Method
- 02/19/2019 · Computational Hardness of Certifying Bounds on Constrained PCA Problems
