On the Generalization Properties of Minimum-norm Solutions for Over-parameterized Neural Network Models

12/15/2019
by   Weinan E, et al.

We study the generalization properties of minimum-norm solutions for three over-parameterized machine learning models: the random feature model, the two-layer neural network model, and the residual network model. We prove that for all three models, the generalization error of the minimum-norm solution is comparable to the Monte Carlo rate, up to logarithmic terms, provided the models are sufficiently over-parameterized.


Related research

- Machine Learning from a Continuous Viewpoint (12/30/2019) — We present a continuous formulation of machine learning, as a problem in...
- Smaller generalization error derived for deep compared to shallow residual neural networks (10/05/2020) — Estimates of the generalization error are proved for a residual neural n...
- A Priori Estimates of the Population Risk for Residual Networks (03/06/2019) — Optimal a priori estimates are derived for the population risk of a regu...
- Barron Spaces and the Compositional Function Spaces for Neural Network Models (06/18/2019) — One of the key issues in the analysis of machine learning models is to i...
- A Priori Estimates of the Generalization Error for Two-layer Neural Networks (10/15/2018) — New estimates for the generalization error are established for the two-l...
- Interpolating Classifiers Make Few Mistakes (01/28/2021) — This paper provides elementary analyses of the regret and generalization...
- Network Model Selection Using Task-Focused Minimum Description Length (10/14/2017) — Networks are fundamental models for data used in practically every appli...
