
Computation complexity of deep ReLU neural networks in high-dimensional approximation
The purpose of the present paper is to study the computation complexity ...

Sparse-grid sampling recovery and deep ReLU neural networks in high-dimensional approximation
We investigate approximations of functions from the Hölder-Zygmund space...

Approximation with Tensor Networks. Part I: Approximation Spaces
We study the approximation of functions by tensor networks (TNs). We sho...

Constructive sparse trigonometric approximation for functions with small mixed smoothness
The paper gives a constructive method, based on greedy algorithms, that ...

Expressivity of expand-and-sparsify representations
A simple sparse coding mechanism appears in the sensory systems of sever...

Manifold learning with arbitrary norms
Manifold learning methods play a prominent role in nonlinear dimensional...

How anisotropic mixed smoothness affects the decay of singular numbers of Sobolev embeddings
We continue the research on the asymptotic and preasymptotic decay of si...
High-dimensional nonlinear approximation by parametric manifolds in Hölder-Nikol'skii spaces of mixed smoothness
We study high-dimensional nonlinear approximation of functions in Hölder-Nikol'skii spaces H^α_∞(𝕀^d) of mixed smoothness on the unit cube 𝕀^d := [0,1]^d by parametric manifolds. The approximation error is measured in the L_∞-norm. In this context, we explicitly construct methods of nonlinear approximation and give estimates of the approximation error that are explicit in the dimension d and in the number N measuring the computation complexity of the parametric manifold of approximants. For d = 2, we derive a novel right asymptotic order of non-continuous manifold N-widths of the unit ball of H^α_∞(𝕀^2) in the space L_∞(𝕀^2). In constructing the approximation methods, the decomposition of a function by the tensor product Faber series and special representations of its truncations on sparse grids play a central role.
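The Faber decomposition underlying the construction can be illustrated in one dimension: a function on [0,1] is expanded in dyadic hat functions, with coefficients given by second dyadic differences, and truncating the series at level K yields the piecewise-linear interpolant on the grid of mesh 2^{-(K+1)}. The following is a minimal sketch of this idea only (the function names and the sine test function are illustrative, not taken from the paper, and the full construction there is the tensor-product, sparse-grid version):

```python
import numpy as np

def hat(x, k, s):
    # Faber hat function at dyadic level k, shift s,
    # supported on [s/2^k, (s+1)/2^k] with peak value 1 at the midpoint.
    return np.maximum(0.0, 1.0 - np.abs(2.0**(k + 1) * x - 2 * s - 1))

def faber_coeff(f, k, s):
    # Faber coefficient lambda_{k,s}(f): a second dyadic difference of f
    # at the midpoint of the support interval.
    h = 2.0 ** (-(k + 1))
    x = (2 * s + 1) * h
    return f(x) - 0.5 * (f(x - h) + f(x + h))

def faber_truncation(f, K):
    # Truncated Faber series of f through level K, plus the linear
    # interpolant of the endpoint values (the "level -1" term).
    def g(x):
        x = np.asarray(x, dtype=float)
        out = f(0.0) * (1.0 - x) + f(1.0) * x
        for k in range(K + 1):
            for s in range(2**k):
                out = out + faber_coeff(f, k, s) * hat(x, k, s)
        return out
    return g
```

For a smooth test function the sup-norm error of the truncation decays as the level K grows, which is the one-dimensional analogue of the error estimates in the abstract.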