Deep ReLU Neural Network Approximation for Stochastic Differential Equations with Jumps

02/23/2021
by   Lukas Gonon, et al.

Deep neural networks (DNNs) with ReLU activation function are proved to be able to express viscosity solutions of linear partial integro-differential equations (PIDEs) on state spaces of possibly high dimension d. Admissible PIDEs comprise Kolmogorov equations for high-dimensional diffusion and advection processes and for pure-jump Lévy processes. We prove for such PIDEs, arising from a class of jump-diffusions on ℝ^d, that for any compact K⊂ℝ^d there exist constants C,𝔭,𝔮>0 such that for every ε∈(0,1] and every d∈ℕ the normalized (over K) L^2-expression error of viscosity solutions of the PIDE by DNNs is of size ε with DNN size bounded by Cd^𝔭ε^{-𝔮}. In particular, the constant C>0 is independent of d∈ℕ and of ε∈(0,1] and depends only on the coefficients in the PIDE and on the measure used to quantify the error. This establishes that ReLU DNNs can break the curse of dimensionality (CoD for short) for viscosity solutions of linear, possibly degenerate PIDEs corresponding to Markovian jump-diffusion processes. As a consequence of the employed techniques, we also obtain that expectations of a large class of path-dependent functionals of the underlying jump-diffusion processes can be expressed without the CoD.
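In compact form, the main expression-rate statement can be paraphrased as follows; the size functional size(·), the realization map ℛ and the measure μ_d stand in for the paper's precise definitions and are used here only for illustration:

\exists\, C,\mathfrak{p},\mathfrak{q}>0 \;\; \forall\, d\in\mathbb{N},\ \varepsilon\in(0,1] \;\; \exists\ \text{ReLU DNN } \Phi_{\varepsilon,d}:
\qquad \mathrm{size}(\Phi_{\varepsilon,d}) \le C\, d^{\mathfrak{p}}\, \varepsilon^{-\mathfrak{q}},
\qquad \Bigl(\int_K \bigl|u_d(x)-\mathcal{R}(\Phi_{\varepsilon,d})(x)\bigr|^2 \,\mu_d(\mathrm{d}x)\Bigr)^{1/2} \le \varepsilon,

where u_d denotes the viscosity solution of the PIDE on ℝ^d and K⊂ℝ^d is the given compact set.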


