Learning Feature Sparse Principal Components

04/23/2019
by Lai Tian, et al.

Sparse PCA has proven effective in high-dimensional data analysis, yet a gap remains between computational methods and statistical theory. This paper presents algorithms for row-sparsity-constrained PCA, termed Feature Sparse PCA (FSPCA), which performs feature selection and PCA simultaneously. Existing techniques for the FSPCA problem suffer from two main drawbacks: (1) most approaches compute only the leading eigenvector and rely on deflation to estimate the leading-m eigenspace, which introduces feature-sparsity inconsistency, identifiability, and orthogonality issues; (2) some approaches are heuristics without a convergence guarantee. In this paper, we present convergence-guaranteed algorithms that directly estimate the leading-m eigenspace. Specifically, we show that for a low-rank covariance matrix, the FSPCA problem can be solved globally (Algorithm 1). We then propose an algorithm (Algorithm 2) that solves FSPCA for a general covariance matrix by iteratively building a carefully designed low-rank proxy covariance matrix. Theoretical analysis establishes the convergence guarantee. Experimental results show the promising performance of the new algorithms compared with state-of-the-art methods on both synthetic and real-world datasets.
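For concreteness, one common way to write the row-sparsity-constrained problem the abstract describes is the following (a plausible formulation consistent with the abstract's wording; the symbols A, W, d, m, and k are our notation, not taken from the paper):

```latex
\max_{W \in \mathbb{R}^{d \times m}} \operatorname{tr}\!\left(W^{\top} A W\right)
\quad \text{s.t.} \quad W^{\top} W = I_m, \;\; \|W\|_{2,0} \le k
```

Here A is the (sample) covariance matrix, the norm counts the nonzero rows of W, and k is the number of selected features; the shared row support across all m columns is what couples feature selection with estimating the leading-m eigenspace, avoiding the sparsity-inconsistency issue of per-eigenvector deflation.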
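Since the abstract does not spell out Algorithms 1 and 2, the sketch below is not the paper's method but a minimal brute-force baseline for the same objective: for a fixed support of k features, the optimal orthonormal W is given by the top-m eigenvectors of the corresponding k-by-k submatrix, so exhaustive search over supports is globally optimal for toy problem sizes. The function name fspca_brute_force and all parameters are our own illustration.

```python
# Brute-force reference solver for the row-sparse PCA objective above.
# NOT the paper's Algorithm 1 or 2 -- an exhaustive-search baseline that is
# globally optimal by construction, feasible only for small d and k.
from itertools import combinations

import numpy as np


def fspca_brute_force(A, k, m):
    """Maximize tr(W^T A W) s.t. W^T W = I_m and W has at most k nonzero rows.

    For a fixed row support S, the optimum is attained by the top-m
    eigenvectors of the k x k submatrix A[S, S], so we enumerate supports.
    """
    d = A.shape[0]
    assert m <= k <= d
    best_val, best_W = -np.inf, None
    for S in combinations(range(d), k):
        S = list(S)
        # Eigen-decomposition of the principal submatrix on support S
        # (eigh returns eigenvalues in ascending order).
        vals, vecs = np.linalg.eigh(A[np.ix_(S, S)])
        val = vals[-m:].sum()               # sum of the top-m eigenvalues
        if val > best_val:
            W = np.zeros((d, m))
            W[S, :] = vecs[:, -m:]          # embed eigenvectors into rows S
            best_val, best_W = val, W
    return best_W, best_val


# Tiny usage example on a random sample covariance matrix.
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 8))
A = X.T @ X / 50
W, val = fspca_brute_force(A, k=3, m=2)
print("selected features:", np.flatnonzero(np.abs(W).sum(axis=1) > 0))
print("objective value:", val)
```

The enumeration over supports is combinatorial in d and k, which is exactly why the paper's contribution of polynomial-time, convergence-guaranteed algorithms matters; this sketch is useful only as a ground-truth check on small instances.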
