Fast Ridge Regression with Randomized Principal Component Analysis and Gradient Descent

05/15/2014
by Yichao Lu, et al.

We propose a new two-stage algorithm, LING, for large-scale regression problems. LING has the same risk as the well-known Ridge Regression under the fixed design setting and can be computed much faster. Our experiments show that LING performs well in terms of both prediction accuracy and computational efficiency compared with other large-scale regression algorithms such as Gradient Descent, Stochastic Gradient Descent, and Principal Component Regression, on both simulated and real datasets.
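The abstract does not spell out the two stages, but the title suggests a combination of randomized PCA and gradient descent for the ridge objective. Below is a minimal, hypothetical sketch of what such a two-stage scheme might look like (it is not the authors' LING implementation): stage one solves ridge regression exactly in the top-k principal subspace found by randomized SVD, and stage two refines that solution with plain gradient descent on the full ridge objective. The function name `two_stage_ridge` and the knobs `k`, `lam`, `lr`, and `n_iter` are all illustrative assumptions.

```python
# Hypothetical sketch (not the authors' LING algorithm): a generic two-stage
# ridge solver combining randomized PCA with gradient descent refinement.
import numpy as np
from sklearn.utils.extmath import randomized_svd

def two_stage_ridge(X, y, k=50, lam=1.0, lr=None, n_iter=200):
    """Illustrative two-stage ridge regression; all parameters are made-up knobs."""
    n, d = X.shape

    # Stage 1: randomized PCA, then an exact ridge solve in the k-dim principal subspace.
    U, S, Vt = randomized_svd(X, n_components=k, random_state=0)
    Z = X @ Vt.T                                     # n x k principal-component scores
    w_pc = np.linalg.solve(Z.T @ Z + lam * np.eye(k), Z.T @ y)
    beta = Vt.T @ w_pc                               # map back to d-dim coefficients

    # Stage 2: gradient descent on the full ridge objective, warm-started from the
    # stage-1 solution, to recover signal outside the top-k subspace.
    if lr is None:
        lr = 1.0 / (S[0] ** 2 + lam)                 # crude step size from the top singular value
    for _ in range(n_iter):
        grad = X.T @ (X @ beta - y) + lam * beta
        beta -= lr * grad
    return beta

# Usage on synthetic data:
rng = np.random.default_rng(0)
X = rng.standard_normal((1000, 200))
y = X @ rng.standard_normal(200) + 0.1 * rng.standard_normal(1000)
beta_hat = two_stage_ridge(X, y, k=20, lam=1.0)
```

The point of the warm start is that the principal-subspace solve is cheap and captures most of the variance, so the gradient descent phase only has to correct a small residual rather than fit the model from scratch.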


