Double Descent in Random Feature Models: Precise Asymptotic Analysis for General Convex Regularization

04/06/2022
by David Bosch, et al.

We prove rigorous results on the double descent phenomenon in the random features (RF) model by employing the powerful Convex Gaussian Min-Max Theorem (CGMT) in a novel multi-level manner. Using this technique, we provide precise asymptotic expressions for the generalization error of RF regression under a broad class of convex regularization terms, including arbitrary separable functions. We further specialize our results to the combination of ℓ_1 and ℓ_2 regularization, known as the elastic net, and present numerical studies of this case. We demonstrate the predictive capacity of our framework and show experimentally that the predicted test error is accurate even in the non-asymptotic regime.
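As a quick illustration of the setting analyzed here (a minimal sketch, not the authors' code: the dimensions, the ReLU feature map, and the regularization strengths below are all illustrative assumptions), the following Python snippet fits elastic-net-regularized RF regression at varying feature counts. The test error typically peaks near the interpolation threshold N ≈ n, which is the double descent curve the paper characterizes asymptotically.

```python
# Minimal sketch of double descent in a random features (RF) model
# with elastic-net regularization. All hyperparameters are assumptions.
import numpy as np
from sklearn.linear_model import ElasticNet

rng = np.random.default_rng(0)
d, n_train, n_test = 30, 200, 2000          # input dim and sample sizes (assumed)
beta = rng.standard_normal(d) / np.sqrt(d)  # ground-truth linear target

def data(n):
    X = rng.standard_normal((n, d))
    y = X @ beta + 0.1 * rng.standard_normal(n)  # noisy linear labels
    return X, y

X_tr, y_tr = data(n_train)
X_te, y_te = data(n_test)

for N in [50, 100, 150, 190, 200, 210, 250, 400, 800]:  # number of random features
    W = rng.standard_normal((d, N)) / np.sqrt(d)        # random first-layer weights
    F_tr = np.maximum(X_tr @ W, 0)                      # ReLU random features
    F_te = np.maximum(X_te @ W, 0)
    # Elastic net: l1_ratio mixes the l_1 and l_2 penalties; a small alpha
    # approximates the weakly regularized regime where the peak is visible.
    model = ElasticNet(alpha=1e-4, l1_ratio=0.5, max_iter=50_000).fit(F_tr, y_tr)
    err = np.mean((model.predict(F_te) - y_te) ** 2)
    print(f"N/n = {N / n_train:4.2f}   test MSE = {err:.4f}")
```

Averaging the test error over several draws of W and of the data makes the peak around N/n = 1 easier to see in such a simulation.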

Related research

02/13/2023  Precise Asymptotic Analysis of Deep Random Feature Models
We provide exact asymptotic expressions for the performance of regressio...

10/13/2021  On the Double Descent of Random Features Models Trained with SGD
We study generalization properties of random features (RF) regression in...

08/27/2020  A Precise Performance Analysis of Learning with Random Features
We study the problem of learning an unknown function using random featur...

11/13/2019  A Model of Double Descent for High-dimensional Binary Linear Classification
We consider a model for logistic regression where only a subset of featu...

11/08/2021  There is no Double-Descent in Random Forests
Random Forests (RFs) are among the state-of-the-art in machine learning ...

12/11/2020  Avoiding The Double Descent Phenomenon of Random Feature Models Using Hybrid Regularization
We demonstrate the ability of hybrid regularization methods to automatic...
