Sparse PCA via Covariance Thresholding

11/20/2013
by Yash Deshpande, et al.

In sparse principal component analysis we are given noisy observations of a low-rank matrix of dimension n × p and seek to reconstruct it under additional sparsity assumptions. In particular, we assume here that each of the principal components v_1, ..., v_r has at most s_0 non-zero entries. We are particularly interested in the high-dimensional regime wherein p is comparable to, or even much larger than, n. In an influential paper, [johnstone2004sparse] introduced a simple algorithm that estimates the support of the principal vectors v_1, ..., v_r from the largest entries in the diagonal of the empirical covariance. This method can be shown to identify the correct support with high probability if s_0 < K_1 √(n/p), and to fail with high probability if s_0 > K_2 √(n/p), for two constants 0 < K_1, K_2 < ∞. Despite a considerable amount of work over the last ten years, no practical algorithm exists with provably better support recovery guarantees. Here we analyze a covariance thresholding algorithm that was recently proposed by [KrauthgamerSPCA]. On the basis of numerical simulations (for the rank-one case), these authors conjectured that covariance thresholding correctly recovers the support with high probability for s_0 < K √n (assuming n of the same order as p). We prove this conjecture, and in fact establish a more general guarantee covering higher rank as well as n much smaller than p. Recent lower bounds [berthet2013computational, ma2015sum] suggest that no polynomial-time algorithm can do significantly better. The key technical component of our analysis develops new bounds on the norm of kernel random matrices, in regimes that were not considered before.
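The two algorithms contrasted in the abstract can be sketched in a few lines of NumPy. The first function is Johnstone-style diagonal thresholding (keep the coordinates with the largest empirical variances); the second is a minimal reading of covariance thresholding: soft-threshold the off-diagonal entries of the empirical covariance at a level of order 1/√n, then take the largest entries of the leading eigenvector of the thresholded matrix. The threshold constant and the support-extraction step here are illustrative choices, not the exact ones from the paper.

```python
import numpy as np

def diagonal_thresholding_support(X, s0):
    """Estimate the support by the s0 largest diagonal entries of the
    empirical covariance (i.e., the s0 largest empirical variances)."""
    variances = (X ** 2).mean(axis=0)  # diagonal of X^T X / n
    return np.sort(np.argsort(variances)[-s0:])

def covariance_thresholding_support(X, s0, tau=None):
    """Sketch of covariance thresholding: soft-threshold the off-diagonal
    entries of the empirical covariance at level tau (of order 1/sqrt(n)),
    then keep the s0 largest entries of the leading eigenvector.
    The constant 4 in the default threshold is an illustrative choice."""
    n, p = X.shape
    G = X.T @ X / n                        # empirical covariance
    if tau is None:
        tau = 4.0 / np.sqrt(n)
    off = G - np.diag(np.diag(G))          # zero out the diagonal
    eta = np.sign(off) * np.maximum(np.abs(off) - tau, 0.0)  # soft threshold
    _, V = np.linalg.eigh(eta)             # eigenvalues in ascending order
    v = V[:, -1]                           # leading eigenvector
    return np.sort(np.argsort(np.abs(v))[-s0:])
```

In a rank-one spiked model x_i = √β u_i v + z_i with a strong spike, both estimators recover the support; the point of the paper is that covariance thresholding keeps working up to s_0 of order √n, where diagonal thresholding already fails.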


