Ridge Regression: Structure, Cross-Validation, and Sketching

10/06/2019
by Sifan Liu et al.

We study three fundamental questions about ridge regression: (1) what is the structure of the estimator? (2) how should cross-validation be used to choose the regularization parameter? and (3) how can computation be accelerated without losing too much accuracy? We address all three in a unified large-data linear model. We give a precise representation of the ridge estimator as a covariance-dependent linear combination of the true parameter and the noise. We characterize the bias of K-fold cross-validation for choosing the regularization parameter and propose a simple bias correction. We analyze the accuracy of primal and dual sketching for ridge regression, showing that both are surprisingly accurate. Our results are illustrated by simulations and by analysis of empirical data.
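As a concrete reference point, here is a minimal NumPy sketch of the three ingredients the abstract discusses: the closed-form ridge estimator, naive K-fold cross-validation over the regularization parameter, and a sketched variant. The function names (`ridge`, `kfold_cv_error`, `sketched_ridge`) are illustrative, and the sketched estimator is the generic Gaussian sketch-and-solve form, not the paper's exact primal and dual sketching estimators or its bias-corrected cross-validation rule.

```python
import numpy as np

def ridge(X, y, lam):
    """Closed-form ridge: argmin_b (1/n)||y - Xb||^2 + lam * ||b||^2."""
    n, p = X.shape
    return np.linalg.solve(X.T @ X / n + lam * np.eye(p), X.T @ y / n)

def kfold_cv_error(X, y, lam, K=5, seed=0):
    """Naive K-fold CV: average held-out MSE of the ridge fit."""
    rng = np.random.default_rng(seed)
    folds = np.array_split(rng.permutation(X.shape[0]), K)
    errs = []
    for k in range(K):
        test = folds[k]
        train = np.concatenate([f for j, f in enumerate(folds) if j != k])
        beta = ridge(X[train], y[train], lam)
        errs.append(np.mean((y[test] - X[test] @ beta) ** 2))
    return float(np.mean(errs))

def sketched_ridge(X, y, lam, m, seed=0):
    """Sketch-and-solve: compress n samples to m << n rows with a
    Gaussian sketch S, then solve the smaller ridge problem."""
    rng = np.random.default_rng(seed)
    S = rng.standard_normal((m, X.shape[0])) / np.sqrt(m)
    return ridge(S @ X, S @ y, lam)

# --- quick demo on synthetic data ---
rng = np.random.default_rng(1)
n, p = 500, 50
X = rng.standard_normal((n, p))
beta_true = rng.standard_normal(p) / np.sqrt(p)
y = X @ beta_true + rng.standard_normal(n)

lams = np.logspace(-3, 1, 20)
lam_cv = lams[int(np.argmin([kfold_cv_error(X, y, lam) for lam in lams]))]
print("CV-chosen lambda:", lam_cv)
print("full-data error:", np.linalg.norm(ridge(X, y, lam_cv) - beta_true))
print("sketched error: ", np.linalg.norm(sketched_ridge(X, y, lam_cv, m=200) - beta_true))
```

Note that each fold in the naive CV loop trains on only n(1 - 1/K) samples rather than n; this sample-size mismatch is, roughly, the kind of effect behind the bias in the CV-chosen regularization parameter that the paper analyzes and corrects.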


Related research:

- Group-regularized ridge regression via empirical Bayes noise level cross-validation (10/29/2020): Features in predictive models are not exchangeable, yet common supervise...
- Can we globally optimize cross-validation loss? Quasiconvexity in ridge regression (07/19/2021): Models like LASSO and ridge regression are extensively used in practice ...
- Fractional ridge regression: a fast, interpretable reparameterization of ridge regression (05/07/2020): Ridge regression (RR) is a regularization technique that penalizes the L...
- Bayesian Projection Pursuit Regression (10/17/2022): In projection pursuit regression (PPR), an unknown response function is ...
- Time-Varying Parameters as Ridge Regressions (09/01/2020): Time-varying parameters (TVPs) models are frequently used in economics t...
- Provably tuning the ElasticNet across instances (07/20/2022): An important unresolved challenge in the theory of regularization is to ...
