Weak Convergence of Tamed Exponential Integrators for Stochastic Differential Equations

04/19/2023
by Utku Erdogan et al.

We prove weak convergence of order one for a class of exponential-based integrators for SDEs with a non-globally Lipschitz drift. Our analysis covers tamed versions of Geometric Brownian Motion (GBM) based methods as well as standard exponential tamed schemes. We compare the numerical performance of both the GBM-based and exponential tamed methods using four different multilevel Monte Carlo techniques. We observe that, for linear noise, the standard exponential tamed method requires severe stepsize restrictions, unlike the GBM-based tamed method.
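To illustrate the type of scheme discussed, below is a minimal sketch of a generic tamed exponential (Lawson-type) Euler step for a scalar SDE with a linear drift part and a non-globally Lipschitz nonlinearity. The splitting into a linear coefficient A and a nonlinearity f, the taming factor 1/(1 + h|f(x)|), and the cubic-drift test problem are illustrative assumptions; this is not the paper's GBM-based scheme or its multilevel Monte Carlo estimators.

    import numpy as np

    def tamed_exponential_euler(A, f, g, x0, T, n_steps, rng=None):
        """Sketch of a tamed exponential Euler method for the scalar SDE
            dX_t = (A*X_t + f(X_t)) dt + g(X_t) dW_t,
        where f may violate a global Lipschitz condition.  The linear part is
        propagated exactly via exp(A*h); the nonlinear drift increment is tamed
        so that h*f(X)/(1 + h*|f(X)|) stays bounded.  Illustrative only, not the
        paper's exact GBM-based or exponential tamed scheme."""
        rng = rng or np.random.default_rng()
        h = T / n_steps
        E = np.exp(A * h)                        # exact flow of the linear drift
        x = x0
        for _ in range(n_steps):
            dW = np.sqrt(h) * rng.standard_normal()
            fx = f(x)
            tamed = h * fx / (1.0 + h * abs(fx))  # taming keeps the drift increment bounded
            x = E * (x + tamed + g(x) * dW)       # integrating-factor (Lawson) update
        return x

    if __name__ == "__main__":
        # Example: cubic (non-globally Lipschitz) drift with linear noise.
        A, sigma = -1.0, 0.5
        f = lambda x: -x**3
        g = lambda x: sigma * x
        samples = [tamed_exponential_euler(A, f, g, x0=1.0, T=1.0, n_steps=200)
                   for _ in range(2000)]
        print("Monte Carlo estimate of E[X_T]:", np.mean(samples))

The weak error of such a scheme is measured through expectations of smooth test functions of X_T, which is why a plain Monte Carlo (or multilevel Monte Carlo) average of the samples, as in the usage example above, is the natural diagnostic.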


Related research

09/26/2019 - Lawson schemes for highly oscillatory stochastic differential equations and conservation of invariants
In this paper, we consider a class of stochastic midpoint and trapezoida...

09/13/2023 - Exp[licit]-A Robot modeling Software based on Exponential Maps
Deriving a robot's equation of motion typically requires placing multipl...

06/21/2021 - Strong Convergence of a GBM Based Tamed Integrator for SDEs and an Adaptive Implementation
We introduce a tamed exponential time integrator which exploits linear t...

04/01/2020 - Discrete-time Simulation of Stochastic Volterra Equations
We study discrete-time simulation schemes for stochastic Volterra equati...

10/28/2019 - Exponential methods for solving hyperbolic problems with application to kinetic equations
The efficient numerical solution of many kinetic models in plasma physic...

02/11/2023 - UGAE: A Novel Approach to Non-exponential Discounting
The discounting mechanism in Reinforcement Learning determines the relat...

08/21/2018 - Non-asymptotic bounds for sampling algorithms without log-concavity
Discrete time analogues of ergodic stochastic differential equations (SD...
