Resampling Sensitivity of High-Dimensional PCA

12/30/2022 · by Haoyu Wang, et al.

The study of the stability and sensitivity of statistical methods or algorithms with respect to their input data is an important problem in machine learning and statistics. The behavior of an algorithm under resampling of its data is a fundamental way to measure its stability and is closely related to the algorithm's generalization and privacy properties. In this paper, we study the resampling sensitivity of principal component analysis (PCA). Given an n × p random matrix 𝐗, let 𝐗^[k] be the matrix obtained from 𝐗 by resampling k randomly chosen entries of 𝐗. Let 𝐯 and 𝐯^[k] denote the leading principal components of 𝐗 and 𝐗^[k], respectively. In the proportional growth regime p/n → ξ ∈ (0,1], we establish the sharp threshold for the sensitivity/stability transition of PCA. When k ≫ n^{5/3}, the principal components 𝐯 and 𝐯^[k] are asymptotically orthogonal. On the other hand, when k ≪ n^{5/3}, the principal components 𝐯 and 𝐯^[k] are asymptotically collinear. In other words, we show that PCA is sensitive to its input data in the sense that resampling even a negligible fraction of the entries may completely change the output.
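The resampling experiment described above is straightforward to simulate. The sketch below is a minimal illustration, not the paper's construction: it assumes i.i.d. standard Gaussian entries, takes the leading right singular vector as the principal component, and uses illustrative helper names such as resample_entries; at moderate n the sharp asymptotic transition at n^{5/3} will only appear approximately.

```python
# Minimal simulation sketch (illustrative, not from the paper): resample k
# entries of an n x p matrix with i.i.d. Gaussian entries and compare the
# top principal components before and after resampling.
import numpy as np

rng = np.random.default_rng(0)

def top_principal_component(X):
    # Leading right singular vector of X, i.e. the top eigenvector of the
    # sample covariance X^T X up to normalization.
    _, _, vt = np.linalg.svd(X, full_matrices=False)
    return vt[0]

def resample_entries(X, k, rng):
    # Replace k uniformly chosen entries of X with fresh i.i.d. draws.
    Xk = X.copy()
    n, p = X.shape
    idx = rng.choice(n * p, size=k, replace=False)
    rows, cols = np.unravel_index(idx, (n, p))
    Xk[rows, cols] = rng.standard_normal(k)
    return Xk

n, p = 400, 200                          # proportional regime: p/n -> xi in (0, 1]
X = rng.standard_normal((n, p))
v = top_principal_component(X)

for k in [n, int(n**1.5), int(n**1.8)]:  # below and above the n^{5/3} threshold
    vk = top_principal_component(resample_entries(X, k, rng))
    overlap = abs(v @ vk)                # |<v, v^[k]>|: near 1 = collinear, near 0 = orthogonal
    print(f"k = {k:>8d}   |<v, v^[k]>| = {overlap:.3f}")
```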
