On the Generalization Power of Overfitted Two-Layer Neural Tangent Kernel Models

by Peizhong Ju et al.

In this paper, we study the generalization performance of min ℓ_2-norm overfitting solutions for the neural tangent kernel (NTK) model of a two-layer neural network. We show that, depending on the ground-truth function, the test error of overfitted NTK models exhibits characteristics that are different from the "double-descent" of other overparameterized linear models with simple Fourier or Gaussian features. Specifically, for a class of learnable functions, we provide a new upper bound of the generalization error that approaches a small limiting value, even when the number of neurons p approaches infinity. This limiting value further decreases with the number of training samples n. For functions outside of this class, we provide a lower bound on the generalization error that does not diminish to zero even when n and p are both large.
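To make the object of study concrete, here is a minimal sketch of a min ℓ_2-norm overfitting solution for a two-layer NTK model. All names, dimensions, and the ground-truth function below are illustrative assumptions, not the paper's actual setup: the NTK feature map linearizes a two-layer ReLU network around its random initialization, and the minimum-norm interpolator is obtained via the pseudoinverse.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: n samples, d input dims, p neurons (p*d >> n, so the
# linearized model is overparameterized and can interpolate the data).
n, d, p = 20, 2, 500

X = rng.standard_normal((n, d))
X /= np.linalg.norm(X, axis=1, keepdims=True)  # inputs on the unit sphere
y = np.sin(2 * np.pi * X[:, 0])                # illustrative ground truth

W = rng.standard_normal((p, d))                # random first-layer weights
v = rng.choice([-1.0, 1.0], size=p)            # fixed top-layer signs

# NTK feature map w.r.t. the first-layer weights of a ReLU network:
# phi(x)_{j,:} = v_j * 1{w_j . x > 0} * x
act = (X @ W.T > 0).astype(float)              # (n, p) activation pattern
Phi = (act * v)[:, :, None] * X[:, None, :]    # (n, p, d)
Phi = Phi.reshape(n, p * d) / np.sqrt(p)

# Min l2-norm interpolating (overfitted) solution: delta = Phi^+ y
delta = np.linalg.pinv(Phi) @ y
train_pred = Phi @ delta
print(np.max(np.abs(train_pred - y)))          # ~0: zero training error
```

Because the feature matrix has more columns (p·d) than rows (n), the pseudoinverse picks, among all weight perturbations that fit the training data exactly, the one with the smallest ℓ_2 norm; the paper's bounds concern the test error of exactly this kind of interpolator.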






