On the Banach spaces associated with multi-layer ReLU networks: Function representation, approximation theory and gradient descent dynamics

07/30/2020 ∙ by Weinan E, et al.

We develop Banach spaces for ReLU neural networks of finite depth L and infinite width. The spaces contain all finite fully connected L-layer networks and their L^2-limiting objects under bounds on the natural path-norm. Under this norm, the unit ball in the space for L-layer networks has low Rademacher complexity and therefore favorable generalization properties. Functions in these spaces can be approximated by multi-layer neural networks with dimension-independent convergence rates. The key to this work is a new way of representing functions as certain expectations, motivated by the structure of multi-layer neural networks. This representation allows us to define a new class of continuous models for machine learning. We show that the gradient flow defined this way is the natural continuous analog of the gradient descent dynamics for the associated multi-layer neural networks, and that the path-norm increases at most polynomially under this continuous gradient flow.
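For readers who want the finite-network version of the path-norm referenced above, the display below is a minimal sketch, assuming the standard path-norm for fully connected ReLU networks without bias terms; the weight matrices W^1, ..., W^L and the entrywise notation |W| are our labels, and the precise norm on the limiting Banach spaces is defined in the full text.

\[
  f(x) \;=\; W^{L}\,\sigma\bigl(W^{L-1}\,\sigma(\cdots\,\sigma(W^{1}x))\bigr),
  \qquad \sigma(t) = \max(t,0),
\]
\[
  \|f\|_{\mathrm{path}}
  \;=\; \sum_{i_0,\dots,i_L}\;\prod_{\ell=1}^{L}\bigl|W^{\ell}_{i_\ell\, i_{\ell-1}}\bigr|
  \;=\; \text{sum of the entries of } |W^{L}|\,|W^{L-1}|\cdots|W^{1}|,
\]

where |W| denotes the entrywise absolute value of the matrix W and the sum runs over all input-to-output paths through the network. For context on the dimension-independent rates mentioned above: in the two-layer (Barron space) case the approximation error for width-m networks scales as the Monte Carlo rate O(m^{-1/2}) in L^2; the multi-layer statement and its constants are given in the full text.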
