Faster Principal Component Regression and Stable Matrix Chebyshev Approximation

by Zeyuan Allen-Zhu, et al.

We solve principal component regression (PCR), up to a multiplicative accuracy of 1+γ, by reducing the problem to Õ(γ^-1) black-box calls to ridge regression. Our algorithm therefore requires no explicit construction of the top principal components and is suitable for large-scale PCR instances. In contrast, the previous best result required Õ(γ^-2) such black-box calls. We obtain this result by developing a general stable recurrence formula for matrix Chebyshev polynomials and a degree-optimal polynomial approximation to the matrix sign function. Our techniques may be of independent interest, especially in the design of iterative methods.
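A central ingredient here is evaluating a matrix Chebyshev polynomial applied to a vector using only matrix-vector products. The sketch below shows the standard three-term Chebyshev recurrence T_{k+1}(x) = 2x·T_k(x) − T_{k−1}(x) in matrix form; it is a minimal illustration of the idea, not the paper's stabilized recurrence, and the function name and interface are assumptions for this example.

```python
import numpy as np

def chebyshev_matrix_apply(M, v, coeffs):
    """Compute sum_k coeffs[k] * T_k(M) @ v via the three-term
    Chebyshev recurrence, using only matrix-vector products
    (no eigendecomposition of M is ever formed).

    Assumes M is symmetric with spectral norm at most 1, so the
    Chebyshev polynomials stay bounded on its spectrum.
    """
    t_prev = v.copy()                 # T_0(M) v = v
    result = coeffs[0] * t_prev
    if len(coeffs) == 1:
        return result
    t_curr = M @ v                    # T_1(M) v = M v
    result = result + coeffs[1] * t_curr
    for k in range(2, len(coeffs)):
        # T_{k+1}(M) v = 2 M T_k(M) v - T_{k-1}(M) v
        t_next = 2.0 * (M @ t_curr) - t_prev
        result = result + coeffs[k] * t_next
        t_prev, t_curr = t_curr, t_next
    return result
```

Because each step costs one multiplication by M, a degree-n polynomial is applied in n matrix-vector products; in the PCR setting each such product is replaced by a black-box ridge-regression call, which is what makes the degree of the sign-function approximation the dominant cost.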

