
Group-regularized ridge regression via empirical Bayes noise level cross-validation
Features in predictive models are not exchangeable, yet common supervise...

On the Optimal Weighted ℓ_2 Regularization in Overparameterized Linear Regression
We consider the linear model 𝐲 = 𝐗β_⋆ + ϵ with 𝐗∈ℝ^n× p in the overparam...

Can we globally optimize cross-validation loss? Quasiconvexity in ridge regression
Models like LASSO and ridge regression are extensively used in practice ...

Fractional ridge regression: a fast, interpretable reparameterization of ridge regression
Ridge regression (RR) is a regularization technique that penalizes the L...

Time-Varying Parameters as Ridge Regressions
Time-varying parameter (TVP) models are frequently used in economics t...

Asymptotics of Ridge(less) Regression under General Source Condition
We analyze the prediction performance of ridge and ridgeless regression ...

On the Asymptotic Optimality of Cross-Validation based Hyperparameter Estimators for Regularized Least Squares Regression Problems
The asymptotic optimality (a.o.) of various hyperparameter estimators w...
Ridge Regression: Structure, Cross-Validation, and Sketching
We study the following three fundamental problems about ridge regression: (1) what is the structure of the estimator? (2) how to correctly use cross-validation to choose the regularization parameter? and (3) how to accelerate computation without losing too much accuracy? We consider the three problems in a unified large-data linear model. We give a precise representation of ridge regression as a covariance matrix-dependent linear combination of the true parameter and the noise. We study the bias of K-fold cross-validation for choosing the regularization parameter, and propose a simple bias-correction. We analyze the accuracy of primal and dual sketching for ridge regression, showing they are surprisingly accurate. Our results are illustrated by simulations and by analyzing empirical data.
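The three ingredients the abstract names can be sketched in a few lines of numpy. The following is a minimal illustration, not the authors' code or their bias-corrected procedure: the closed-form ridge estimator, plain K-fold cross-validation over the regularization parameter, and a simple dual (row) sketch that reduces the number of observations before fitting. The n-scaling of the penalty, the Gaussian sketch, and the toy data are all illustrative assumptions.

```python
import numpy as np

def ridge(X, y, lam):
    # Closed-form ridge estimator: (X'X + n*lam*I)^{-1} X'y
    n, p = X.shape
    return np.linalg.solve(X.T @ X + n * lam * np.eye(p), X.T @ y)

def kfold_cv_error(X, y, lam, K=5, seed=0):
    # Average held-out squared error of ridge over K folds.
    n = X.shape[0]
    idx = np.random.default_rng(seed).permutation(n)
    folds = np.array_split(idx, K)
    errs = []
    for k in range(K):
        test = folds[k]
        train = np.concatenate([folds[j] for j in range(K) if j != k])
        beta = ridge(X[train], y[train], lam)
        errs.append(np.mean((y[test] - X[test] @ beta) ** 2))
    return float(np.mean(errs))

def sketched_ridge(X, y, lam, m, seed=0):
    # Dual (row) sketching: compress n observations to m rows with a
    # Gaussian sketch S, then solve the smaller ridge problem on (SX, Sy).
    n, _ = X.shape
    S = np.random.default_rng(seed).standard_normal((m, n)) / np.sqrt(m)
    return ridge(S @ X, S @ y, lam)

# Toy data from a linear model y = X beta_star + noise.
rng = np.random.default_rng(1)
n, p = 200, 20
X = rng.standard_normal((n, p))
beta_star = rng.standard_normal(p)
y = X @ beta_star + rng.standard_normal(n)

# Pick lambda by (uncorrected) K-fold cross-validation over a small grid.
lams = [0.01, 0.1, 1.0, 10.0]
cv = {lam: kfold_cv_error(X, y, lam) for lam in lams}
best = min(cv, key=cv.get)
beta_hat = ridge(X, y, best)
beta_sketch = sketched_ridge(X, y, best, m=100)
```

Note that the paper's point is precisely that this naive K-fold choice of the regularization parameter is biased in the large-data regime and benefits from a correction; the sketch above only shows the uncorrected baseline.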