Deep neural networks overcome the curse of dimensionality in the numerical approximation of semilinear partial differential equations

05/28/2022
by Petru A. Cioica-Licht, et al.

We prove that deep neural networks are capable of approximating solutions of semilinear Kolmogorov PDEs with gradient-independent, Lipschitz-continuous nonlinearities, where the number of required network parameters grows at most polynomially in both the dimension d ∈ ℕ and the reciprocal 1/ε of the prescribed approximation accuracy ε. Previously, this had only been proven in the case of semilinear heat equations.
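
For orientation, the class of equations referred to above can be illustrated by the following prototypical semilinear Kolmogorov PDE; the symbols μ (drift), σ (diffusion), f (nonlinearity), and g (initial value) are generic placeholders chosen here for illustration and need not match the paper's exact notation or hypotheses:

\[
  \frac{\partial u}{\partial t}(t,x)
  = \frac{1}{2}\operatorname{Trace}\!\bigl(\sigma(x)\,\sigma(x)^{\top}\operatorname{Hess}_x u(t,x)\bigr)
  + \bigl\langle \mu(x), \nabla_x u(t,x) \bigr\rangle
  + f\bigl(u(t,x)\bigr),
  \qquad u(0,x) = g(x),
\]

for (t,x) ∈ [0,T] × ℝ^d. Since f depends on u(t,x) but not on ∇_x u(t,x), the nonlinearity is gradient-independent in the sense of the abstract; the semilinear heat equation corresponds to the special case μ ≡ 0 with σ a constant multiple of the identity.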

Related research

06/03/2020 · Space-time deep neural network approximations for high-dimensional partial differential equations
It is one of the most challenging issues in applied mathematics to appro...

10/30/2021 · Approximation properties of Residual Neural Networks for Kolmogorov PDEs
In recent years residual neural networks (ResNets) as introduced by [He,...

04/27/2020 · Estimating Full Lipschitz Constants of Deep Neural Networks
We estimate the Lipschitz constants of the gradient of a deep neural net...

01/23/2020 · Overcoming the curse of dimensionality for approximating Lyapunov functions with deep neural networks under a small-gain condition
We propose a deep neural network architecture for storing approximate Ly...

12/13/2019 · On the approximation of rough functions with deep neural networks
Deep neural networks and the ENO procedure are both efficient frameworks...

09/28/2022 · Deep learning for gradient flows using the Brezis-Ekeland principle
We propose a deep learning method for the numerical solution of partial ...

04/01/2022 · Deep neural networks for solving extremely large linear systems
In this paper, we study deep neural networks for solving extremely large...
