
Deep ReLU Neural Network Approximation for Stochastic Differential Equations with Jumps

by Lukas Gonon, et al.

Deep neural networks (DNNs) with the ReLU activation function are proved to be able to express viscosity solutions of linear partial integro-differential equations (PIDEs) on state spaces of possibly high dimension d. Admissible PIDEs comprise Kolmogorov equations for high-dimensional diffusion and advection processes and for pure-jump Lévy processes. We prove, for such PIDEs arising from a class of jump-diffusions on ℝ^d, that for any compact K ⊂ ℝ^d there exist constants C, 𝔭, 𝔮 > 0 such that for every ε ∈ (0,1] and every d ∈ ℕ, the normalized (over K) DNN L^2-expression error of viscosity solutions of the PIDE is of size ε with DNN size bounded by Cd^𝔭ε^-𝔮. In particular, the constant C > 0 is independent of d ∈ ℕ and of ε ∈ (0,1] and depends only on the coefficients of the PIDE and on the measure used to quantify the error. This establishes that ReLU DNNs can break the curse of dimensionality (CoD for short) for viscosity solutions of linear, possibly degenerate PIDEs corresponding to Markovian jump-diffusion processes. As a consequence of the employed techniques, we also obtain that expectations of a large class of path-dependent functionals of the underlying jump-diffusion processes can be expressed without the CoD.
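To make the quantity bounded in the theorem concrete, the following minimal sketch (not the paper's construction; network widths and the parameter-counting convention are illustrative assumptions) builds a plain feedforward ReLU network in NumPy and reports its size, i.e. the total number of weights and biases, which is the quantity the abstract bounds by Cd^𝔭ε^-𝔮:

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

class ReLUDNN:
    """Plain feedforward ReLU network; `widths` lists the layer widths
    from the input dimension d down to a scalar output."""
    def __init__(self, widths, rng):
        # One (weight matrix, bias vector) pair per layer transition.
        self.params = [(rng.standard_normal((m, n)) / np.sqrt(n), np.zeros(m))
                       for n, m in zip(widths[:-1], widths[1:])]

    def __call__(self, x):
        for i, (W, b) in enumerate(self.params):
            x = W @ x + b
            if i < len(self.params) - 1:  # no activation on the output layer
                x = relu(x)
        return x

    def size(self):
        # "DNN size" here = total count of weights and biases,
        # the quantity bounded polynomially in d and 1/eps.
        return sum(W.size + b.size for W, b in self.params)

rng = np.random.default_rng(0)
d = 10  # illustrative state-space dimension
net = ReLUDNN([d, 32, 32, 1], rng)
y = net(np.ones(d))
print(net.size())  # → 1441
```

In the theorem, breaking the CoD means that, to reach accuracy ε uniformly over a compact K ⊂ ℝ^d, `net.size()` need only grow like d^𝔭ε^-𝔮 rather than exponentially in d.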



