Discretization and Machine Learning Approximation of BSDEs with a Constraint on the Gains-Process

by Idris Kharroubi, et al.

We study the approximation of backward stochastic differential equations (BSDEs for short) with a constraint on the gains process. We first discretize the constraint by applying a so-called facelift operator at the times of a grid. We show that this discretely constrained BSDE converges to the continuously constrained one as the mesh of the grid goes to zero. We then focus on the approximation of the discretely constrained BSDE, for which we adopt a machine learning approach. We show that the facelift can be approximated by an optimization problem over a class of neural networks, under constraints on the neural network and its derivative. We then derive an algorithm that converges to the discretely constrained BSDE as the number of neurons goes to infinity. We conclude with numerical experiments.

Mathematics Subject Classification (2010): 65C30, 65M75, 60H35, 93E20, 49L25.
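To illustrate the kind of operator the abstract refers to, here is a minimal numerical sketch of a facelift-type transform in one dimension. It assumes a slope constraint set K = [0, L] (a hypothetical choice for illustration, not the paper's setting) and uses the standard representation F[φ](x) = sup_y (φ(x + y) − δ_K(y)), where δ_K is the support function of K, evaluated by brute force over a discretized displacement grid rather than by the neural-network optimization the paper develops.

```python
import numpy as np

# Illustrative 1-d facelift for a slope constraint K = [0, L]:
# F[phi](x) = sup_y ( phi(x + y) - delta_K(y) ),
# where delta_K(y) = sup_{z in K} y*z is the support function of K.
L = 2.0

def support_K(y):
    # Support function of K = [0, L]: L*y for y >= 0, and 0 for y < 0.
    return np.where(y >= 0.0, L * y, 0.0)

def facelift(phi, x, y_grid):
    # Brute-force supremum over a discretized displacement grid.
    return np.max(phi(x + y_grid) - support_K(y_grid))

# Digital payoff: the smallest dominating function with slope in [0, L]
# is max(0, min(1, 1 + L*x)), which the brute-force sup recovers.
phi = lambda x: (x >= 0.0).astype(float)
y_grid = np.linspace(-5.0, 5.0, 100001)

print(round(facelift(phi, -0.25, y_grid), 3))  # ≈ 0.5
```

In the paper's scheme, this pointwise supremum is instead approximated by optimizing over a class of neural networks whose values and derivatives are constrained; the brute-force grid search above is only a low-dimensional stand-in for that step.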






