Machine Learning from a Continuous Viewpoint

by Weinan E et al.

We present a continuous formulation of machine learning, as a problem in the calculus of variations and integro-differential equations, very much in the spirit of classical numerical analysis and statistical physics. We demonstrate that conventional machine learning models and algorithms, such as the random feature model, the shallow neural network model and the residual neural network model, can all be recovered as particular discretizations of different continuous formulations. We also present examples of new models, such as the flow-based random feature model, and new algorithms, such as the smoothed particle method and the spectral method, that arise naturally from this continuous formulation. We discuss how the issues of generalization error and implicit regularization can be studied under this framework.
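To make the "models as discretizations" claim concrete, here is a minimal NumPy sketch (not from the paper's code; all function names and parameter choices are illustrative assumptions). It shows the random feature model as a Monte Carlo discretization of the integral representation f(x) = ∫ a(w) σ(w·x) dπ(w), and a residual network forward pass as a forward Euler discretization of the flow dz/dt = v(z, t) with z(0) = x.

```python
import numpy as np

rng = np.random.default_rng(0)

# 1) Random feature model: m-sample Monte Carlo quadrature of
#    f(x) = ∫ a(w) σ(w·x) dπ(w), with σ = ReLU and π standard Gaussian.
def random_feature_model(x, weights, coeffs):
    # f_m(x) = (1/m) Σ_j a_j σ(w_j · x)
    return np.maximum(weights @ x, 0.0) @ coeffs / len(coeffs)

m, d = 1000, 5
w = rng.standard_normal((m, d))   # frozen features w_j sampled from π
a = rng.standard_normal(m)        # trainable outer coefficients a_j
x = rng.standard_normal(d)
y = random_feature_model(x, w, a)

# 2) Residual network: forward Euler steps of size h for dz/dt = v(z, t),
#    i.e. z_{k+1} = z_k + h · v(z_k, t_k); h → 0 recovers the ODE limit.
def resnet_forward(x, layer_weights, h):
    z = x.copy()
    for Wk in layer_weights:      # each residual block is one Euler step
        z = z + h * np.tanh(Wk @ z)
    return z

L = 20
h = 1.0 / L
layers = [rng.standard_normal((d, d)) / np.sqrt(d) for _ in range(L)]
z_T = resnet_forward(x, layers, h)
```

In this reading, the continuous objects (the measure π, the coefficient function a(w), the velocity field v) are the primary model, and the familiar finite-width or finite-depth networks appear once a particular quadrature rule or time-stepping scheme is chosen.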

