Optimal Approximation Rates for Deep ReLU Neural Networks on Sobolev Spaces

11/25/2022
by Jonathan W. Siegel, et al.

We study the problem of how efficiently, in terms of the number of parameters, deep neural networks with the ReLU activation function can approximate functions in the Sobolev space W^s(L_q(Ω)) on a bounded domain Ω, where the error is measured in the L_p(Ω) norm. This problem is important for the application of neural networks in scientific computing, and it had previously been solved only in the case p=q=∞. Our contribution is a solution for all 1≤ p,q≤∞ and s > 0. Our results show that deep ReLU networks significantly outperform classical methods of approximation, but that this comes at the cost of parameters which are not encodable, i.e., the network weights cannot be represented with a bounded number of bits per parameter without sacrificing the improved rates.
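
As a concrete illustration of the quantities in the abstract (this is not the construction from the paper), the following minimal sketch realizes a classical baseline: the piecewise-linear interpolant of a target function on [0,1], written as a one-hidden-layer ReLU network. It counts the network's parameters and estimates the L_p(Ω) approximation error on a fine grid. The target f(x) = sqrt(x), the uniform knots, and p = 2 are illustrative assumptions; the sketch assumes only NumPy.

```python
# A minimal sketch: a piecewise-linear interpolant on [0, 1] expressed as a
# shallow ReLU network, with a parameter count and an L_p error estimate.
# The target function, knot placement, and p are illustrative choices only.
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

def pwl_as_relu_net(f, n_knots):
    """Return (predict, n_params) for the piecewise-linear interpolant of f on
    [0, 1] with n_knots uniform knots, written as the shallow ReLU network
    g(x) = f(0) + sum_k c_k * relu(x - t_k)."""
    t = np.linspace(0.0, 1.0, n_knots)
    y = f(t)
    slopes = np.diff(y) / np.diff(t)      # slope on each subinterval
    c = np.diff(slopes, prepend=0.0)      # slope increment contributed by each knot
    knots = t[:-1]                        # one ReLU unit per subinterval
    def predict(x):
        return y[0] + relu(x[:, None] - knots[None, :]) @ c
    # input weight, bias, and output weight per unit, plus the output bias
    n_params = 3 * len(knots) + 1
    return predict, n_params

def lp_error(f, g, p=2.0, n_grid=20_000):
    """Estimate the L_p(0, 1) error of g against f on a uniform grid."""
    x = np.linspace(0.0, 1.0, n_grid)
    return np.mean(np.abs(f(x) - g(x)) ** p) ** (1.0 / p)

if __name__ == "__main__":
    for n in (8, 32, 128, 512):
        g, m = pwl_as_relu_net(np.sqrt, n)
        print(f"knots={n:4d}  params={m:5d}  L2 error={lp_error(np.sqrt, g):.2e}")
```

Such shallow, fixed-knot constructions are the classical methods the abstract refers to; the paper quantifies how much better deep ReLU architectures can do, for general p, q, and s, in terms of the number of parameters.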
