
Fast Ridge Regression with Randomized Principal Component Analysis and Gradient Descent

05/15/2014
by Yichao Lu, et al.
University of Pennsylvania

We propose LING, a new two-stage algorithm for large-scale regression problems. LING has the same risk as the well-known Ridge Regression under the fixed-design setting and can be computed much faster. Our experiments show that LING performs well in terms of both prediction accuracy and computational efficiency compared with other large-scale regression algorithms, such as Gradient Descent, Stochastic Gradient Descent, and Principal Component Regression, on both simulated and real datasets.
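The abstract does not spell out the two stages, but the title suggests a first stage built on randomized PCA and a second stage built on gradient descent. The Python sketch below is one plausible reading under those assumptions, not the paper's exact algorithm: regress y on the top-k principal components obtained from a randomized range finder, then run gradient descent on the remaining residual over the full feature space. All names (randomized_top_components, ling_sketch) and parameter choices (k, gd_steps, the step size) are illustrative.

```python
import numpy as np

def randomized_top_components(X, k, n_iter=2, seed=0):
    """Approximate the top-k right singular vectors of X with a
    randomized range finder plus a few power iterations."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    Omega = rng.standard_normal((d, k))
    Y = X @ Omega
    for _ in range(n_iter):              # power iterations sharpen the leading spectrum
        Y = X @ (X.T @ Y)
    Q, _ = np.linalg.qr(Y)               # orthonormal basis for the range of Y
    B = Q.T @ X                          # small k x d matrix
    _, _, Vt = np.linalg.svd(B, full_matrices=False)
    return Vt[:k].T                      # d x k approximate principal directions

def ling_sketch(X, y, k=20, gd_steps=100, lr=None):
    """Illustrative two-stage regression (an assumption, not the paper's code):
    stage 1 fits y on the top-k principal components,
    stage 2 runs gradient descent on the residual over all features."""
    # Stage 1: principal component regression on randomized top-k directions.
    V = randomized_top_components(X, k)
    Z = X @ V                            # n x k projected features
    gamma, *_ = np.linalg.lstsq(Z, y, rcond=None)
    w1 = V @ gamma                       # lift the coefficients back to feature space
    # Stage 2: gradient descent on the least-squares residual.
    r = y - X @ w1
    w2 = np.zeros(X.shape[1])
    if lr is None:
        lr = 1.0 / (np.linalg.norm(X, ord=2) ** 2)   # step size from the spectral norm
    for _ in range(gd_steps):
        grad = X.T @ (X @ w2 - r)
        w2 -= lr * grad
    return w1 + w2

# Tiny usage example on synthetic data.
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.standard_normal((500, 200))
    beta = rng.standard_normal(200)
    y = X @ beta + 0.1 * rng.standard_normal(500)
    w = ling_sketch(X, y, k=20, gd_steps=200)
    print("relative error:", np.linalg.norm(w - beta) / np.linalg.norm(beta))
```

The step size 1/σ_max(X)² keeps the second-stage gradient descent stable for the least-squares objective; the paper's actual choices of rank, iteration count, and regularization may differ.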


Related research:

- A note on the prediction error of principal component regression (11/07/2018): We analyse the prediction error of principal component regression (PCR) ...
- A Risk Comparison of Ordinary Least Squares vs Ridge Regression (05/04/2011): We compare the risk of ridge regression to a simple variant of ordinary ...
- Distributed Principal Component Analysis with Limited Communication (10/27/2021): We study efficient distributed algorithms for the fundamental problem of...
- Faster Principal Component Regression and Stable Matrix Chebyshev Approximation (08/16/2016): We solve principal component regression (PCR), up to a multiplicative ac...
- A General Framework for Analyzing Stochastic Dynamics in Learning Algorithms (06/11/2020): We present a general framework for analyzing high-probability bounds for...
- Understanding Machine-learned Density Functionals (04/04/2014): Kernel ridge regression is used to approximate the kinetic energy of non...