Asymptotics of Ridge(less) Regression under General Source Condition

06/11/2020
by Dominic Richards et al.

We analyse the prediction performance of ridge and ridgeless regression when both the number of samples and the dimension of the data go to infinity. In particular, we consider a general setting with prior assumptions that characterise "easy" and "hard" learning problems. In this setting, we show that ridgeless (zero-regularisation) regression is optimal for easy problems with a high signal-to-noise ratio. Furthermore, we show that additional descents in the ridgeless bias and variance learning curves can occur beyond the interpolation threshold, verifying recent empirical observations. More generally, we show how a variety of learning curves are possible depending on the problem at hand. From a technical point of view, characterising the influence of the prior assumptions requires extending previous applications of random matrix theory to the study of ridge regression.
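The "additional descents" phenomenon mentioned above can be illustrated with a short simulation. The sketch below is not the authors' code; it is a minimal illustration under assumed parameter values (sample size n = 100, noise level 0.5, isotropic Gaussian data). It fixes the sample size, sweeps the dimension d across the interpolation threshold d = n, and compares the test risk of the min-norm (ridgeless) interpolator with that of ridge regression.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100          # sample size (assumed value, for illustration)
noise_std = 0.5  # observation noise level (assumed)
n_trials = 20    # repetitions to average out randomness

def test_risk(d, lam):
    """Average excess test risk of ridge regression (lam > 0)
    or the min-norm ridgeless interpolator (lam == 0)."""
    risks = []
    for _ in range(n_trials):
        w = rng.normal(size=d) / np.sqrt(d)   # random teacher signal, ||w||^2 ~ 1
        X = rng.normal(size=(n, d))           # isotropic Gaussian design
        y = X @ w + noise_std * rng.normal(size=n)
        if lam == 0.0:
            w_hat = np.linalg.pinv(X) @ y     # minimum-norm least-squares solution
        else:
            w_hat = np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)
        # For isotropic Gaussian test points, excess risk = ||w_hat - w||^2.
        risks.append(np.sum((w_hat - w) ** 2))
    return float(np.mean(risks))

for d in [25, 50, 90, 100, 110, 200, 400]:
    print(f"d/n = {d/n:4.2f}  ridgeless: {test_risk(d, 0.0):7.3f}  "
          f"ridge (lam=1.0): {test_risk(d, 1.0):7.3f}")
```

Under these assumptions the ridgeless risk typically peaks near the interpolation threshold d/n = 1 and descends again beyond it, while a positive ridge penalty smooths out the peak; the shape of the curve on either side of the threshold depends on the signal, noise, and spectrum of the data, in line with the "easy" versus "hard" distinction drawn in the abstract.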


Related research

06/28/2023 · Transfer Learning with Random Coefficient Ridge Regression
Ridge regression with random coefficients provides an important alternat...

05/29/2023 · Generalized equivalences between subsampling and ridge regularization
We establish precise structural and risk equivalences between subsamplin...

09/29/2020 · Benign overfitting in ridge regression
Classical learning theory suggests that strong regularization is needed ...

11/06/2020 · Ridge Regression with Frequent Directions: Statistical and Optimization Perspectives
Despite its impressive theory & practical performance, Frequent Directio...

04/08/2014 · Efficiency of conformalized ridge regression
Conformal prediction is a method of producing prediction sets that can b...

07/06/2023 · Learning Curves for Heterogeneous Feature-Subsampled Ridge Ensembles
Feature bagging is a well-established ensembling method which aims to re...

09/24/2019 · Simple and Almost Assumption-Free Out-of-Sample Bound for Random Feature Mapping
Random feature mapping (RFM) is a popular method for speeding up kernel ...
