On the Generalization Properties of Minimum-norm Solutions for Over-parameterized Neural Network Models

12/15/2019
by   Weinan E, et al.

We study the generalization properties of minimum-norm solutions for three over-parameterized machine learning models: the random feature model, the two-layer neural network model, and the residual network model. We prove that for all three models, the generalization error of the minimum-norm solution is comparable to the Monte Carlo rate, up to logarithmic terms, provided the models are sufficiently over-parameterized.
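For the simplest of the three models, the random feature model, the minimum-norm interpolant can be computed in closed form as the least-norm solution of an underdetermined linear system. The sketch below illustrates this setup with NumPy; the target function, feature distribution, and problem sizes are illustrative choices, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative target function on [0, 1]
def f(x):
    return np.sin(2 * np.pi * x)

n, m = 50, 2000          # n training points, m >> n random features (over-parameterized)
x_train = rng.uniform(0, 1, n)
y_train = f(x_train)

# Random feature map phi_j(x) = cos(w_j * x + b_j), with w, b fixed at random
w = rng.normal(0, 10, m)
b = rng.uniform(0, 2 * np.pi, m)

def features(x):
    return np.cos(np.outer(x, w) + b)   # shape (len(x), m)

Phi = features(x_train)                 # n x m: underdetermined system Phi @ a = y

# For an underdetermined system, lstsq returns the minimum-norm interpolant
a, *_ = np.linalg.lstsq(Phi, y_train, rcond=None)

# The minimum-norm solution interpolates the training data exactly
train_err = np.max(np.abs(Phi @ a - y_train))

# Its test error is what the paper's generalization bounds control
x_test = rng.uniform(0, 1, 1000)
test_err = np.sqrt(np.mean((features(x_test) @ a - f(x_test)) ** 2))
print(f"max train residual: {train_err:.2e}, test RMSE: {test_err:.3f}")
```

Since m > n, there are infinitely many interpolating coefficient vectors; `lstsq` selects the one with the smallest Euclidean norm, which is exactly the minimum-norm solution whose generalization the abstract refers to.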


12/30/2019

Machine Learning from a Continuous Viewpoint

We present a continuous formulation of machine learning, as a problem in...
10/05/2020

Smaller generalization error derived for deep compared to shallow residual neural networks

Estimates of the generalization error are proved for a residual neural n...
03/06/2019

A Priori Estimates of the Population Risk for Residual Networks

Optimal a priori estimates are derived for the population risk of a regu...
06/18/2019

Barron Spaces and the Compositional Function Spaces for Neural Network Models

One of the key issues in the analysis of machine learning models is to i...
10/15/2018

A Priori Estimates of the Generalization Error for Two-layer Neural Networks

New estimates for the generalization error are established for the two-l...
01/28/2021

Interpolating Classifiers Make Few Mistakes

This paper provides elementary analyses of the regret and generalization...
10/18/2021

Minimum ℓ_1-norm interpolators: Precise asymptotics and multiple descent

An evolving line of machine learning works observe empirical evidence th...