
Principal Component Projection and Regression in Nearly Linear Time through Asymmetric SVRG

by Yujia Jin, et al.

Given a data matrix A∈R^n × d, principal component projection (PCP) and principal component regression (PCR), i.e. projection and regression restricted to the top eigenspace of A, are fundamental problems in machine learning, optimization, and numerical analysis. In this paper we provide the first algorithms that solve these problems in nearly linear time for a fixed eigenvalue distribution and large n. This improves upon previous methods, which have superlinear running times when both the number of top eigenvalues and the inverse gap between eigenspaces are large. We achieve our results by applying rational approximations to reduce PCP and PCR to solving asymmetric linear systems, which we solve by a variant of SVRG. We corroborate these findings with preliminary empirical experiments.
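As a point of reference for the two problems the abstract defines, PCP and PCR can be computed exactly from a full eigendecomposition of A^T A — the dense O(d^3) baseline whose cost the paper's nearly linear time algorithms avoid. Below is a minimal NumPy sketch of that baseline (the matrix sizes and the cutoff k are illustrative choices, not values from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, k = 200, 20, 5                  # samples, features, top eigenvalues kept
A = rng.standard_normal((n, d))
b = rng.standard_normal(n)

# Dense baseline: eigendecomposition of the covariance matrix A^T A.
M = A.T @ A
eigvals, eigvecs = np.linalg.eigh(M)  # eigenvalues in ascending order
V_top = eigvecs[:, -k:]               # orthonormal basis of the top-k eigenspace

# PCP: orthogonally project a vector v onto the top-k eigenspace of A^T A.
v = rng.standard_normal(d)
v_proj = V_top @ (V_top.T @ v)

# PCR: least-squares regression with the solution restricted to span(V_top),
# i.e. minimize ||A x - b|| over x = V_top * c.
x_pcr = V_top @ np.linalg.lstsq(A @ V_top, b, rcond=None)[0]
```

The projection is idempotent (projecting v_proj again leaves it unchanged), and the PCR residual is never smaller than that of unrestricted least squares, since the search is over a subspace.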


Principal Component Projection Without Principal Component Analysis

We show how to efficiently project a vector onto the top principal compo...

A note on the prediction error of principal component regression in high dimensions

We analyze the prediction error of principal component regression (PCR) ...

A note on the variance in principal component regression

Principal component regression is a popular method to use when the predi...

Sketching for Principal Component Regression

Principal component regression (PCR) is a useful method for regularizing...

Essential Number of Principal Components and Nearly Training-Free Model for Spectral Analysis

Through a study of multi-gas mixture datasets, we show that in multi-com...

Quantum-Inspired Classical Algorithm for Principal Component Regression

This paper presents a sublinear classical algorithm for principal compon...

Faster Principal Component Regression and Stable Matrix Chebyshev Approximation

We solve principal component regression (PCR), up to a multiplicative ac...