
Machine Learning from a Continuous Viewpoint
We present a continuous formulation of machine learning, as a problem in...

Smaller generalization error derived for deep compared to shallow residual neural networks
Estimates of the generalization error are proved for a residual neural n...

Barron Spaces and the Compositional Function Spaces for Neural Network Models
One of the key issues in the analysis of machine learning models is to i...

A Priori Estimates of the Population Risk for Residual Networks
Optimal a priori estimates are derived for the population risk of a regu...

A Priori Estimates of the Generalization Error for Two-layer Neural Networks
New estimates for the generalization error are established for the twol...

Analysis of the Gradient Descent Algorithm for a Deep Neural Network Model with Skip-connections
The behavior of the gradient descent (GD) algorithm is analyzed for a de...

A priori generalization error for two-layer ReLU neural network through minimum norm solution
We focus on estimating a priori generalization error of two-layer ReLU n...
On the Generalization Properties of Minimum-norm Solutions for Overparameterized Neural Network Models
We study the generalization properties of minimum-norm solutions for three overparameterized machine learning models: the random feature model, the two-layer neural network model, and the residual network model. We prove that for all three models, the generalization error of the minimum-norm solution is comparable to the Monte Carlo rate, up to logarithmic terms, as long as the models are sufficiently overparameterized.
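To illustrate the setting of this abstract, the following is a minimal sketch (not the paper's experiments) of the minimum-norm solution for an overparameterized random feature model. The target function, feature count, and ReLU features are assumptions for illustration; `np.linalg.lstsq` returns the minimum-norm coefficient vector when the feature matrix is wide.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: n samples, m random features, input dimension d,
# with m > n so the model is overparameterized (interpolation regime).
n, m, d = 50, 500, 5
target_w = rng.normal(size=d)

X = rng.normal(size=(n, d))
y = np.cos(X @ target_w)          # assumed smooth target function

# Random feature map: phi(x) = ReLU(x . b) with fixed random directions b.
B = rng.normal(size=(d, m))
Phi = np.maximum(X @ B, 0.0)

# For a wide (underdetermined) system, lstsq picks the minimum-norm
# coefficient vector among all interpolating solutions.
a, *_ = np.linalg.lstsq(Phi, y, rcond=None)

# Training error is near zero; the test error is a Monte Carlo estimate
# of the population risk of the minimum-norm interpolant.
X_test = rng.normal(size=(1000, d))
y_test = np.cos(X_test @ target_w)
Phi_test = np.maximum(X_test @ B, 0.0)

train_err = np.mean((Phi @ a - y) ** 2)
test_err = np.mean((Phi_test @ a - y_test) ** 2)
print(train_err, test_err)
```

The sketch only demonstrates the object being analyzed; the paper's contribution is a bound on the test error of this kind of solution, not a numerical experiment.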