Sobolev-type embeddings for neural network approximation spaces

10/28/2021
by Philipp Grohs et al.

We consider neural network approximation spaces that classify functions according to the rate at which they can be approximated (with error measured in L^p) by ReLU neural networks with an increasing number of coefficients, subject to bounds on the magnitude of the coefficients and on the number of hidden layers. We prove embedding theorems between these spaces for different values of p. Furthermore, we derive sharp embeddings of these approximation spaces into Hölder spaces. We find that, analogous to the case of classical function spaces (such as Sobolev or Besov spaces), it is possible to trade "smoothness" (i.e., approximation rate) for increased integrability. Combined with our earlier results in [arXiv:2104.02746], our embedding theorems imply a somewhat surprising fact about "learning" functions from a given neural network space based on point samples: if accuracy is measured with respect to the uniform norm, then an optimal "learning" algorithm for reconstructing functions that are well approximable by ReLU neural networks is simply given by piecewise constant interpolation on a tensor product grid.
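To make the final claim concrete, the following is a minimal sketch (not the authors' code) of the reconstruction rule the abstract refers to: sample the target function on a uniform tensor product grid over [0,1]^d and reconstruct by piecewise constant interpolation, assigning to each query point the sample at the nearest grid node. All names and parameters below (grid resolution n, the example target f, the test set) are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def tensor_grid(n, d):
    """Uniform tensor product grid with n points per axis on [0, 1]^d."""
    axes = [np.linspace(0.0, 1.0, n) for _ in range(d)]
    mesh = np.meshgrid(*axes, indexing="ij")
    return np.stack([m.ravel() for m in mesh], axis=-1)  # shape (n**d, d)

def piecewise_constant_interpolant(samples, n, d):
    """Return a function x -> value of the sample at the nearest grid node.

    `samples` are the function values on the grid produced by tensor_grid,
    listed in the same (row-major) order.
    """
    values = samples.reshape((n,) * d)

    def interpolant(x):
        x = np.atleast_2d(np.asarray(x, dtype=float))        # (m, d) query points
        idx = np.clip(np.rint(x * (n - 1)).astype(int), 0, n - 1)
        return values[tuple(idx.T)]                          # nearest grid node

    return interpolant

if __name__ == "__main__":
    # Illustrative target: a smooth function on [0, 1]^2.
    f = lambda p: np.sin(2 * np.pi * p[:, 0]) * p[:, 1]
    n, d = 33, 2
    grid = tensor_grid(n, d)
    f_hat = piecewise_constant_interpolant(f(grid), n, d)

    # Uniform-norm error on random test points; for Lipschitz targets this
    # decays like the grid spacing as n grows.
    test = np.random.default_rng(0).random((10_000, d))
    print("max |f - f_hat| on test points:", np.abs(f(test) - f_hat(test)).max())
```

The point of the paper's result, as stated in the abstract, is that in the worst case over a neural network approximation space, no sampling-based method achieves a better uniform-norm error rate than this simple interpolation scheme.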

