Robust covariance estimation for distributed principal component analysis

10/14/2020
by   Kangqiang Li, et al.

Principal component analysis (PCA) is a well-known tool for dimension reduction: it summarises data in fewer than the original number of dimensions without losing essential information. However, when data are dispersed across multiple servers, communication costs make standard PCA impractical, so distributed algorithms for PCA are needed. Fan et al. [Annals of Statistics 47(6) (2019) 3009-3031] proposed a distributed PCA algorithm to address this problem: each server computes the K leading eigenvectors V_K^(ℓ) = (v_1^(ℓ), …, v_K^(ℓ)) ∈ ℝ^{d×K} of its local sample covariance matrix and sends V_K^(ℓ) to the data center. In this paper, we introduce the robust covariance matrix estimators proposed by Minsker [Annals of Statistics 46(6A) (2018) 2871-2903] and by Ke et al. [Statistical Science 34(3) (2019) 454-471] into the distributed PCA algorithm: each server computes the top K eigenvectors of its local robust covariance estimator and transmits them to the central server. We investigate the statistical error of the resulting distributed estimator and derive the rate of convergence of distributed PCA estimators for symmetric innovation distributions and for general distributions. A simulation study verifies the theoretical results. We also extend the analysis to a heterogeneous setting with weaker moment conditions, in which samples are independent both within and across servers and the population covariance matrices differ across servers but share the same top K eigenvectors.
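To make the pipeline concrete, here is a minimal Python sketch under simplifying assumptions: the local robust estimator is taken to be an element-wise truncated covariance in the spirit of Ke et al. (one of several estimators they discuss; Minsker's estimator instead truncates via the matrix functional calculus), the truncation level tau is a fixed user-supplied constant rather than a theoretically calibrated choice, and the central server is assumed to aggregate by averaging the projection matrices V_K^(ℓ) V_K^(ℓ)ᵀ and taking the top K eigenvectors of the average, as in the Fan et al. scheme. The function and parameter names (truncated_covariance, distributed_pca, tau) are illustrative, not from the paper.

```python
import numpy as np

def truncated_covariance(X, tau):
    # Element-wise truncated covariance (Ke et al.-style sketch):
    # each cross-product X_ij * X_ik is shrunk to [-tau, tau] before
    # averaging, which limits the influence of heavy-tailed outliers.
    prods = np.einsum('ij,ik->ijk', X, X)          # n x d x d cross-products
    return np.clip(prods, -tau, tau).mean(axis=0)  # d x d robust estimate

def top_k_eigenvectors(S, K):
    # Leading K eigenvectors of a symmetric matrix, largest eigenvalues first.
    vals, vecs = np.linalg.eigh(S)
    return vecs[:, np.argsort(vals)[::-1][:K]]     # d x K

def distributed_pca(samples_per_server, K, tau):
    # Each server sends only its d x K matrix of local leading eigenvectors;
    # the center averages the projections V V^T and extracts the top-K
    # eigenvectors of that average (assumed aggregation, per Fan et al.).
    d = samples_per_server[0].shape[1]
    avg_proj = np.zeros((d, d))
    for X in samples_per_server:
        V = top_k_eigenvectors(truncated_covariance(X, tau), K)
        avg_proj += V @ V.T
    avg_proj /= len(samples_per_server)
    return top_k_eigenvectors(avg_proj, K)

# Hypothetical usage: 10 servers, 500 heavy-tailed samples each, d = 50, K = 3.
rng = np.random.default_rng(0)
data = [rng.standard_t(df=3, size=(500, 50)) for _ in range(10)]
V_hat = distributed_pca(data, K=3, tau=5.0)
print(V_hat.shape)  # (50, 3)
```

Note that each server communicates only a d × K matrix rather than its d × d covariance estimate or the raw data, which is the source of the communication savings motivating the distributed approach.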
