Faster Principal Component Regression and Stable Matrix Chebyshev Approximation

08/16/2016
by Zeyuan Allen-Zhu, et al.

We solve principal component regression (PCR), up to a multiplicative accuracy 1+γ, by reducing the problem to Õ(γ^-1) black-box calls of ridge regression. Therefore, our algorithm does not require any explicit construction of the top principal components, and is suitable for large-scale PCR instances. In contrast, previous results require Õ(γ^-2) such black-box calls. We obtain this result by developing a general stable recurrence formula for matrix Chebyshev polynomials, and a degree-optimal polynomial approximation to the matrix sign function. Our techniques may be of independent interest, especially for designing iterative methods.
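To make the Chebyshev ingredient concrete, here is a minimal sketch (not the paper's algorithm) of the standard three-term recurrence for applying a matrix Chebyshev polynomial to a vector, T_{k+1}(M)v = 2M(T_k(M)v) - T_{k-1}(M)v. In the PCR setting of the abstract, each multiplication by M would be replaced by a black-box ridge-regression solve; here M is an explicit symmetric matrix for illustration, and the function name `chebyshev_apply` is our own.

```python
import numpy as np

def chebyshev_apply(M, v, coeffs):
    """Return (sum_k coeffs[k] * T_k(M)) @ v via the three-term recurrence.

    M is assumed symmetric with spectrum in [-1, 1], so the Chebyshev
    polynomials T_k act as the scalar T_k on each eigenvalue.
    """
    t_prev = v.copy()                   # T_0(M) v = v
    result = coeffs[0] * t_prev
    if len(coeffs) == 1:
        return result
    t_curr = M @ v                      # T_1(M) v = M v
    result = result + coeffs[1] * t_curr
    for c in coeffs[2:]:
        # T_{k+1}(M) v = 2 M (T_k(M) v) - T_{k-1}(M) v
        t_next = 2 * (M @ t_curr) - t_prev
        result = result + c * t_next
        t_prev, t_curr = t_curr, t_next
    return result

# Example: coeffs [0, 0, 1] selects T_2(M) = 2 M^2 - I.
M = np.diag([0.5, -0.25, 1.0])
v = np.ones(3)
out = chebyshev_apply(M, v, [0.0, 0.0, 1.0])
```

For the diagonal example, T_2 acts entrywise as 2x^2 - 1 on each eigenvalue. The paper's contribution is a *stable* variant of such recurrences, tolerating inexact matrix-vector products (inexact ridge-regression solves), which a naive implementation like the above does not address.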

Related research

- Principal Component Projection Without Principal Component Analysis (02/22/2016): We show how to efficiently project a vector onto the top principal compo...
- Fast Ridge Regression with Randomized Principal Component Analysis and Gradient Descent (05/15/2014): We propose a new two stage algorithm LING for large scale regression pro...
- A note on the variance in principal component regression (01/04/2023): Principal component regression is a popular method to use when the predi...
- The Sparse Principal Component of a Constant-rank Matrix (12/20/2013): The computation of the sparse principal component of a matrix is equival...
- Principal Component Projection and Regression in Nearly Linear Time through Asymmetric SVRG (10/15/2019): Given a data matrix A∈R^n × d, principal component projection (PCP) and ...
- The Canny-Emiris conjecture for the sparse resultant (04/30/2020): We present a product formula for the initial parts of the sparse resulta...
- Who Votes for Library Bonds? A Principal Component Exploration (06/26/2021): Previous research has shown a relationship between voter characteristics...
