Find the dimension that counts: Fast dimension estimation and Krylov PCA

10/08/2018
by Shashanka Ubaru et al.

High-dimensional data and systems with many degrees of freedom are often characterized by covariance matrices. In this paper, we consider the problem of simultaneously estimating the dimension of the principal (dominant) subspace of these covariance matrices and obtaining an approximation to the subspace. This problem arises in the popular principal component analysis (PCA), and in many applications of machine learning, data analysis, signal and image processing, and others. We first present a novel method for estimating the dimension of the principal subspace. We then show how this method can be coupled with a Krylov subspace method to simultaneously estimate the dimension and obtain an approximation to the subspace; the dimension estimation is achieved at no additional cost. The proposed method operates within a model selection framework, where the novel selection criterion is derived from ideas in random matrix perturbation theory. We present theoretical analyses which (a) show that the proposed method achieves strong consistency (i.e., yields the optimal solution as the number of data points n→∞), and (b) analyze conditions for exact dimension estimation in the finite-n case. Using recent results, we show that our algorithm also yields near-optimal PCA. The proposed method avoids explicitly forming the sample covariance matrix associated with the data and computing its complete eigendecomposition. The method is therefore inexpensive, which is particularly advantageous in modern data applications where the covariance matrices can be very large. Numerical experiments illustrate the performance of the proposed method in various applications.
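The core idea, estimating the principal-subspace dimension from the top eigenvalues produced by a Krylov solver without ever forming the covariance matrix, can be sketched as follows. This is a minimal illustration, not the paper's actual selection criterion: it substitutes a simple largest-relative-eigengap rule for the model-selection criterion derived in the paper, and `estimate_principal_dimension` and `max_dim` are names introduced here for the sketch.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, eigsh

def estimate_principal_dimension(X, max_dim=20):
    """Estimate the principal-subspace dimension of the covariance of
    X (n samples x d features) without forming the d x d sample
    covariance matrix explicitly.

    A Lanczos (Krylov) solver approximates the top eigenpairs of
    C = X_c^T X_c / n through matrix-vector products alone; the
    dimension is then chosen at the largest relative eigengap
    (a simple stand-in for the paper's model-selection criterion).
    """
    n, d = X.shape
    Xc = X - X.mean(axis=0)  # center the data

    # Matrix-vector product with the implicit sample covariance.
    C = LinearOperator((d, d), matvec=lambda v: Xc.T @ (Xc @ v) / n,
                       dtype=float)

    k = min(max_dim, d - 1)
    vals, vecs = eigsh(C, k=k, which="LM")  # Lanczos iteration
    vals, vecs = vals[::-1], vecs[:, ::-1]  # sort descending

    # Choose the dimension with the largest relative eigengap.
    gaps = vals[:-1] / np.maximum(vals[1:], 1e-12)
    r = int(np.argmax(gaps)) + 1
    return r, vals, vecs[:, :r]

# Synthetic check: data with a clear rank-3 dominant subspace.
rng = np.random.default_rng(0)
n, d, true_r = 2000, 100, 3
U = np.linalg.qr(rng.standard_normal((d, true_r)))[0]
X = (rng.standard_normal((n, true_r)) * [10.0, 8.0, 6.0]) @ U.T \
    + 0.1 * rng.standard_normal((n, d))
r, vals, basis = estimate_principal_dimension(X)
```

With a strong signal-to-noise separation as above, the eigengap rule recovers the planted dimension, and `basis` gives an orthonormal approximation to the principal subspace, mirroring the paper's point that the dimension estimate falls out of the same Krylov computation used for the PCA itself.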

Related research

A generalized information criterion for high-dimensional PCA rank selection (04/29/2020)
Principal component analysis (PCA) is the most commonly used statistical...

Near-Optimal Stochastic Approximation for Online Principal Component Estimation (03/16/2016)
Principal component analysis (PCA) has been a prominent tool for high-di...

Data Distillery: Effective Dimension Estimation via Penalized Probabilistic PCA (03/20/2018)
The paper tackles the unsupervised estimation of the effective dimension...

On the Subspace of Image Gradient Orientations (05/16/2010)
We introduce the notion of Principal Component Analysis (PCA) of image g...

Estimating covariance and precision matrices along subspaces (09/26/2019)
We study the accuracy of estimating the covariance and the precision mat...

coVariance Neural Networks (05/31/2022)
Graph neural networks (GNN) are an effective framework that exploit inte...

Bayesian inference for PCA and MUSIC algorithms with unknown number of sources (09/26/2018)
Principal component analysis (PCA) is a popular method for projecting da...
