Deep Learning in High Dimension: Neural Network Approximation of Analytic Functions in L^2(ℝ^d,γ_d)

11/13/2021
by Christoph Schwab, et al.

For artificial deep neural networks, we prove expression rates for analytic functions f:ℝ^d→ℝ in the norm of L^2(ℝ^d,γ_d), where d∈ℕ∪{∞}. Here γ_d denotes the Gaussian product probability measure on ℝ^d. We consider in particular ReLU and ReLU^k activations for integer k ≥ 2. For d∈ℕ, we show exponential convergence rates in L^2(ℝ^d,γ_d). In the case d=∞, under suitable smoothness and sparsity assumptions on f:ℝ^ℕ→ℝ, with γ_∞ denoting the infinite (Gaussian) product measure on ℝ^ℕ, we prove dimension-independent expression rate bounds in the norm of L^2(ℝ^ℕ,γ_∞). The rates depend only on the quantified holomorphy of (an analytic continuation of) the map f on a product of strips in ℂ^d. As an application, we prove expression rate bounds of deep ReLU-NNs for response surfaces of elliptic PDEs with log-Gaussian random field inputs.
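To make the error norm concrete, the following is a minimal numerical sketch (not the paper's construction or proof): it estimates the L^2(ℝ^d,γ_d) distance between a hypothetical analytic target f and a small, untrained one-hidden-layer ReLU network by Monte Carlo sampling from the Gaussian product measure γ_d. The target function, the architecture, and all parameters below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4  # input dimension (finite-dimensional case d in N)

def f(x):
    # A hypothetical analytic target f: R^d -> R (illustrative choice only).
    return np.exp(-0.5 * np.sum(x**2, axis=-1))

def relu(t):
    return np.maximum(t, 0.0)

# A one-hidden-layer ReLU network with random, untrained parameters,
# standing in for a deep ReLU surrogate; the paper constructs such
# networks explicitly, which this sketch does not attempt.
width = 64
W1 = rng.normal(scale=1.0 / np.sqrt(d), size=(d, width))
b1 = rng.normal(size=width)
W2 = rng.normal(scale=1.0 / np.sqrt(width), size=width)
b2 = 0.0

def relu_net(x):
    return relu(x @ W1 + b1) @ W2 + b2

# Monte Carlo approximation of ||f - relu_net||_{L^2(R^d, gamma_d)}:
# draw x ~ N(0, I_d), i.e. from the Gaussian product measure gamma_d,
# and average the squared pointwise error.
n_samples = 200_000
x = rng.standard_normal(size=(n_samples, d))
err_sq = np.mean((f(x) - relu_net(x)) ** 2)
print(f"Monte Carlo estimate of the L^2(gamma_d) error: {np.sqrt(err_sq):.4f}")
```

The expression rate results concern how fast this error can be made to decay as the network size grows; the sketch only shows how the Gaussian-weighted L^2 error itself is measured.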
