
Compositional Sparsity, Approximation Classes, and Parametric Transport Equations

by   Wolfgang Dahmen, et al.

Approximating functions of a large number of variables poses particular challenges, often subsumed under the term "Curse of Dimensionality". Unless the approximated function exhibits a very high level of smoothness, the Curse can be avoided only by exploiting some typically hidden structural sparsity. In this paper we propose a general framework for model classes of functions in high dimensions, based on suitable notions of compositional sparsity that quantify approximability by highly nonlinear expressions such as deep neural networks. The relevance of these concepts is demonstrated for solution manifolds of parametric transport equations, which are known not to enjoy the type of high-order regularity of parameter-to-solution maps that helps to avoid the Curse of Dimensionality in other model scenarios. Compositional sparsity is shown to serve as the key mechanism for proving that sparsity of problem data is inherited, in a quantifiable way, by the solution manifold. In particular, one obtains convergence rates for deep neural network realizations showing that the Curse of Dimensionality is indeed avoided.
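As a minimal illustration (not taken from the paper), the core idea behind compositional sparsity is that a function of many variables may be expressible through few low-dimensional building blocks. The hypothetical sketch below evaluates a d-variate function as a balanced binary tree of bivariate maps: only d - 1 two-variable components are needed, whereas a generic tensor-product discretization with n points per coordinate would require n^d samples.

```python
def tree_compose(values, g):
    """Evaluate a d-variate function as a balanced binary tree of
    bivariate compositions: adjacent pairs are combined by the
    two-variable map g until a single value remains.

    A d-variate evaluation uses exactly d - 1 applications of g,
    i.e. the component count grows linearly in d, in contrast to
    the exponential n**d cost of a full tensor-product grid.
    """
    values = list(values)
    calls = 0
    while len(values) > 1:
        nxt = []
        for i in range(0, len(values) - 1, 2):
            nxt.append(g(values[i], values[i + 1]))
            calls += 1
        if len(values) % 2 == 1:   # carry an unpaired value upward
            nxt.append(values[-1])
        values = nxt
    return values[0], calls


# Example: the sum of d = 8 variables realized compositionally.
result, calls = tree_compose(range(8), lambda a, b: a + b)
# result == 28, and only calls == 7 (= d - 1) bivariate components
# were used; a tensor-product grid with n = 10 points per axis
# would instead require 10**8 samples for the same d.
```

This is only a toy for the counting argument; the paper's actual model classes quantify how well such compositional structures are approximated by deep neural networks.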



