A Risk Comparison of Ordinary Least Squares vs Ridge Regression

05/04/2011
by Paramveer S. Dhillon et al.

We compare the risk of ridge regression to a simple variant of ordinary least squares, in which one simply projects the data onto a finite dimensional subspace (as specified by a Principal Component Analysis) and then performs an ordinary (un-regularized) least squares regression in this subspace. This note shows that the risk of this ordinary least squares method is within a constant factor (namely 4) of the risk of ridge regression.
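The comparison described above can be sketched numerically: fit ridge regression, then fit an un-regularized least squares model in the subspace spanned by the top principal components, and compare the two. This is only an illustrative sketch on synthetic data (the sample size, dimension, noise level, and anisotropic design below are all assumptions, not taken from the paper), not a reproduction of the paper's risk bound.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical synthetic setup: anisotropic design with a decaying scale
# per coordinate, so a low-dimensional principal subspace carries most signal.
n, d, k = 500, 50, 10  # samples, features, PCA subspace dimension (assumed)
X = rng.standard_normal((n, d)) * (1.0 / np.arange(1, d + 1))
w_true = rng.standard_normal(d)
y = X @ w_true + 0.5 * rng.standard_normal(n)

# Ridge regression: w = (X^T X + lambda I)^{-1} X^T y
lam = 1.0  # assumed regularization strength
w_ridge = np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

# PCA-projected OLS: project data onto the top-k principal directions
# (right singular vectors of X), then run plain least squares there.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
Vk = Vt[:k].T            # (d, k) basis of the principal subspace
Z = X @ Vk               # data represented in the subspace
w_pcr = Vk @ np.linalg.lstsq(Z, y, rcond=None)[0]

# Compare empirical mean squared error of the two fits.
mse_ridge = np.mean((X @ w_ridge - y) ** 2)
mse_pcr = np.mean((X @ w_pcr - y) ** 2)
print(f"ridge MSE: {mse_ridge:.4f}, PCA-OLS MSE: {mse_pcr:.4f}")
```

On data like this the two errors are typically of the same order, which is the qualitative content of the note's constant-factor (factor-4) risk comparison; the formal statement concerns risk under the paper's assumptions, not the in-sample MSE computed here.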


Related research

- Fast Ridge Regression with Randomized Principal Component Analysis and Gradient Descent (05/15/2014)
- Robust Least Squares for Quantized Data (03/26/2020)
- Making ordinary least squares linear classifiers more robust (08/28/2018)
- Orthogonal and Linear Regressions and Pencils of Confocal Quadrics (09/04/2022)
- No penalty no tears: Least squares in high-dimensional linear models (06/07/2015)
- Random design analysis of ridge regression (06/13/2011)
- A group-based approach to the least squares regression for handling multicollinearity from strongly correlated variables (04/07/2018)
