Weighted variation spaces and approximation by shallow ReLU networks

07/28/2023
by Ronald DeVore, et al.

We investigate the approximation of functions f on a bounded domain Ω⊂ℝ^d by the outputs of single-hidden-layer ReLU neural networks of width n. This form of nonlinear n-term dictionary approximation has been intensely studied since it is the simplest case of neural network approximation (NNA). Several celebrated approximation results for this form of NNA introduce novel model classes of functions on Ω whose approximation rates avoid the curse of dimensionality. These classes include the Barron classes and classes based on sparsity or variation, such as the Radon-domain BV classes. The present paper is concerned with how such model classes should be defined on a domain Ω: their current definitions do not depend on Ω at all. A new and more appropriate definition of model classes on domains is given by introducing the concept of weighted variation spaces. These new model classes are intrinsic to the domain itself. Their importance is that they are strictly larger than the classical (domain-independent) classes, yet, as is shown, they maintain the same NNA rates.
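To fix ideas, here is a minimal sketch (not taken from the paper; all names and parameter choices are illustrative) of the approximating objects in question: the outputs of a single-hidden-layer ReLU network of width n, namely f_n(x) = c + Σ_{i=1}^n a_i ReLU(⟨w_i, x⟩ + b_i).

import numpy as np

def shallow_relu_network(x, W, b, a, c=0.0):
    # Evaluate f_n(x) = c + sum_i a_i * ReLU(<w_i, x> + b_i),
    # a single-hidden-layer ReLU network of width n.
    #   x : (d,) point in the domain Omega, a subset of R^d
    #   W : (n, d) inner weights, one row w_i per hidden unit
    #   b : (n,) inner biases
    #   a : (n,) outer weights
    #   c : optional output offset
    pre = W @ x + b                      # pre-activations, shape (n,)
    return c + a @ np.maximum(pre, 0.0)  # ReLU, then weighted sum

# Illustrative use: a width-16 network with random parameters,
# evaluated at a point of the unit cube Omega = [0, 1]^3.
rng = np.random.default_rng(0)
d, n = 3, 16
W = rng.standard_normal((n, d))
b = rng.standard_normal(n)
a = rng.standard_normal(n) / n
x = rng.uniform(0.0, 1.0, size=d)
print(shallow_relu_network(x, W, b, a))

Approximation by such networks means choosing the parameters (a_i, w_i, b_i) so that f_n is close to a target f on Ω; the model classes discussed above quantify which targets admit approximation rates in n that do not deteriorate with the dimension d.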
