Gradient flow in the Gaussian covariate model: exact solution of learning curves and multiple descent structures

12/13/2022
by Antoine Bodin, et al.

A recent line of work has revealed remarkable behaviors of the generalization error curves in simple learning models. Even least-squares regression exhibits atypical features such as model-wise double descent, and further works have observed triple or multiple descents. Another important characteristic is the epoch-wise descent structure that emerges during training. Model-wise and epoch-wise descents have been derived analytically only in limited theoretical settings (such as the random feature model) and are otherwise experimental observations. In this work, we provide a full and unified analysis of the entire time evolution of the generalization curve, in the asymptotic high-dimensional regime and under gradient flow, within a broader theoretical setting stemming from a Gaussian covariate model. In particular, we cover most cases already observed disparately in the literature, and we also provide examples of multiple descent structures as a function of a model parameter or of time. Furthermore, we show that our theoretical predictions adequately match the learning curves obtained by gradient descent on realistic datasets. Technically, we compute averages of rational expressions involving random matrices using recent developments in random matrix theory based on "linear pencils". Another contribution, also of independent interest in random matrix theory, is a new derivation of the related fixed-point equations (and an extension thereof) using Dyson Brownian motion.


Related research

- Model, sample, and epoch-wise descents: exact solution of gradient flow in the random feature model (10/22/2021)
- On the Role of Optimization in Double Descent: A Least Squares Study (07/27/2021)
- Gradient flow on extensive-rank positive semi-definite matrix denoising (03/16/2023)
- Learning curves for deep structured Gaussian feature models (03/01/2023)
- A Model of Double Descent for High-dimensional Binary Linear Classification (11/13/2019)
- A Continuous-Time View of Early Stopping for Least Squares Regression (10/23/2018)
