Sparse-grid sampling recovery and deep ReLU neural networks in high-dimensional approximation

07/17/2020
by Dinh Dũng, et al.

We investigate the approximation of functions from the Hölder-Zygmund space of mixed smoothness H^α_∞(𝕀^d), defined on the d-dimensional unit cube 𝕀^d := [0,1]^d, by linear algorithms of sparse-grid sampling recovery and by deep ReLU (Rectified Linear Unit) neural networks when the dimension d may be very large. The approximation error is measured in the norm of the isotropic Sobolev space W^1,p_0(𝕀^d). The optimality of this sampling recovery is studied in terms of sampling n-widths. Optimal linear sampling algorithms are constructed on sparse grids using the piecewise-linear B-spline interpolation representation, and we prove tight dimension-dependent bounds on the sampling n-widths that are explicit in d and n. Based on these sampling-recovery results, we investigate the expressive power of deep ReLU neural networks for approximating functions in this Hölder-Zygmund space. Namely, for any function f ∈ H^α_∞(𝕀^d), we explicitly construct a deep ReLU neural network whose output approximates f in the W^1,p_0(𝕀^d)-norm with a prescribed accuracy ε, and we prove tight dimension-dependent bounds on the computation complexity of this approximation, characterized by the number of weights and the depth of the network, explicitly in d and ε. Moreover, we show that under a certain restriction the curse of dimensionality can be avoided in the approximations by sparse-grid sampling recovery and by deep ReLU neural networks.
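Two ingredients mentioned in the abstract, piecewise-linear interpolation on dyadic grids and its exact representation by ReLU units, can be illustrated in one dimension. The sketch below is only a hypothetical illustration, not the paper's construction (which is multivariate, uses sparse grids, and measures error in the W^1,p_0 norm): it interpolates a univariate function with hat functions, each of which is written as a three-term combination of ReLUs. NumPy and the target function f are assumptions for the demo.

```python
# Illustrative 1D sketch only, not the paper's construction: the paper works with
# multivariate sparse grids and measures error in the W^1,p_0 norm. Here we
# interpolate a univariate function on a dyadic grid with piecewise-linear hat
# functions, each written exactly as a three-term combination of ReLUs.
# NumPy and the target function `f` below are assumptions for the demo.
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def hat(x, center, h):
    """Hat function of width h centered at `center`, built from three ReLU units."""
    return (relu(x - center + h) - 2.0 * relu(x - center) + relu(x - center - h)) / h

def dyadic_interpolant(f, level):
    """Piecewise-linear interpolant of f on [0, 1] at the dyadic grid of the given level."""
    h = 2.0 ** (-level)
    nodes = np.linspace(0.0, 1.0, 2 ** level + 1)
    values = f(nodes)
    def p(x):
        x = np.asarray(x, dtype=float)
        return sum(v * hat(x, c, h) for c, v in zip(nodes, values))
    return p

if __name__ == "__main__":
    f = lambda x: x * np.sin(2.0 * np.pi * x)   # hypothetical smooth target
    p = dyadic_interpolant(f, level=5)          # 2^5 + 1 = 33 sample points
    xs = np.linspace(0.0, 1.0, 1001)
    print("max interpolation error at level 5:", np.max(np.abs(f(xs) - p(xs))))
```

Since each hat is realized by three ReLU units, the interpolant above is itself a shallow ReLU network whose number of weights scales with the number of grid points. Roughly speaking, in d dimensions a sparse-grid (Smolyak-type) construction restricts the sum of the univariate levels, which keeps the number of sample points, and hence of network weights, far below that of a full tensor-product grid.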


