Spectrum Dependent Learning Curves in Kernel Regression and Wide Neural Networks

02/07/2020
by Blake Bordelon, et al.

A fundamental question in modern machine learning is how deep neural networks generalize. We address this question using 1) an equivalence between training infinitely wide neural networks and performing kernel regression with a deterministic kernel, the Neural Tangent Kernel (NTK) (Jacot et al., 2018), and 2) theoretical tools from statistical physics. We derive analytical expressions for the learning curves of kernel regression and use them to evaluate how the test loss of a trained neural network depends on the number of training samples. Our approach yields not only the total test risk but also its decomposition into contributions from the different spectral components of the kernel. Complementary to recent results showing that, during gradient descent, neural networks fit low-frequency components of the target first, we identify a new type of frequency principle: as the size of the training set grows, kernel machines and neural networks fit successively higher-frequency modes of the target function. We verify our theory with simulations of kernel regression and of training wide artificial neural networks.
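This sample-size frequency principle is easy to probe numerically. Below is a minimal sketch, not the authors' code: it fits kernel ridge regression on the circle with a Gaussian kernel and a two-mode target, then projects the test residual onto each Fourier mode of the target. The kernel, lengthscale, ridge parameter, target, and grid are all illustrative assumptions made for this demo.

```python
# Minimal sketch (illustrative, not the authors' code): kernel ridge
# regression on the circle. The target mixes a low-frequency (k=1) and a
# high-frequency (k=4) Fourier mode; we track how much of each mode
# remains in the test residual as the training set size p grows.
import numpy as np

rng = np.random.default_rng(0)

def rbf_kernel(x, y, lengthscale=0.5):
    # Gaussian kernel on the circle, using the chordal distance of the
    # embedding x -> (cos x, sin x) so the kernel is periodic.
    dx = np.cos(x)[:, None] - np.cos(y)[None, :]
    dy = np.sin(x)[:, None] - np.sin(y)[None, :]
    return np.exp(-(dx**2 + dy**2) / (2.0 * lengthscale**2))

def target(x):
    return np.cos(1 * x) + np.cos(4 * x)  # one low, one high frequency mode

x_test = np.linspace(0.0, 2.0 * np.pi, 512, endpoint=False)
ridge = 1e-6  # small ridge term for numerical stability

for p in (4, 8, 16, 32, 64, 128):
    errs = []
    for _ in range(20):  # average over random draws of the training set
        x_tr = rng.uniform(0.0, 2.0 * np.pi, p)
        K = rbf_kernel(x_tr, x_tr) + ridge * np.eye(p)
        alpha = np.linalg.solve(K, target(x_tr))
        resid = rbf_kernel(x_test, x_tr) @ alpha - target(x_test)
        # Fourier coefficients of the residual at the two target modes;
        # their squares are the per-mode contributions to the test risk.
        a1 = 2.0 * (resid @ np.cos(1 * x_test)) / x_test.size
        a4 = 2.0 * (resid @ np.cos(4 * x_test)) / x_test.size
        errs.append((a1**2, a4**2))
    e1, e4 = np.mean(errs, axis=0)
    print(f"p={p:4d}  mode k=1 error: {e1:.4f}  mode k=4 error: {e4:.4f}")
```

In runs of this sketch the k=1 error collapses at much smaller p than the k=4 error, mirroring the frequency principle described above: kernel machines allocate additional samples to successively higher-frequency modes of the target.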

Related research

09/14/2023 · Is Solving Graph Neural Tangent Kernel Equivalent to Training Graph Neural Network?
A rising trend in theoretical deep learning is to understand why deep le...

06/04/2021 · Out-of-Distribution Generalization in Kernel Regression
In real world applications, the data generating process for training a machin...

08/01/2023 · An Exact Kernel Equivalence for Finite Classification Models
We explore the equivalence between neural networks and kernel methods by...

12/30/2019 · Disentangling trainability and generalization in deep learning
A fundamental goal in deep learning is the characterization of trainabil...

09/04/2022 · On Kernel Regression with Data-Dependent Kernels
The primary hyperparameter in kernel regression (KR) is the choice of ke...

06/23/2020 · Statistical Mechanics of Generalization in Kernel Regression
Generalization beyond a training dataset is a main goal of machine learn...

02/28/2021 · Asymptotic Risk of Overparameterized Likelihood Models: Double Descent Theory for Deep Neural Networks
We investigate the asymptotic risk of a general class of overparameteriz...
