Non-asymptotic bounds for sampling algorithms without log-concavity

08/21/2018
by   Mateusz B. Majka, et al.

Discrete-time analogues of ergodic stochastic differential equations (SDEs) are among the most popular and flexible tools for sampling high-dimensional probability measures. Non-asymptotic analysis in the L^2 Wasserstein distance of sampling algorithms based on Euler discretisations of SDEs has recently been developed by several authors for log-concave probability distributions. In this work we replace the log-concavity assumption with a condition of log-concavity at infinity. We provide novel L^2 convergence rates for Euler schemes, expressed explicitly in terms of problem parameters. From there we derive non-asymptotic bounds on the distance between the laws induced by Euler schemes and the invariant laws of SDEs, both for schemes with standard and with randomised (inaccurate) drifts. We also obtain bounds across a hierarchy of discretisations, which enables us to deploy a multi-level Monte Carlo estimator. Our proof relies on a novel construction of a coupling for the Markov chains that can be used to control the L^1 and L^2 Wasserstein distances simultaneously. Finally, we provide a weak convergence analysis that covers both the standard and the randomised (inaccurate) drift case. In particular, we reveal that the variance of the randomised drift does not influence the rate of weak convergence of the Euler scheme to the SDE.
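To make the setting concrete, here is a minimal sketch (not the authors' implementation) of the kind of Euler scheme the abstract refers to: an Euler–Maruyama discretisation X_{k+1} = X_k + h b(X_k) + sqrt(2h) Z_k of the overdamped Langevin SDE, targeting a double-well density exp(-U) with U(x) = ||x||^4/4 - ||x||^2/2, which is not log-concave but is log-concave at infinity (strongly convex outside a ball). The `drift_noise` parameter is a hypothetical knob illustrating a randomised (inaccurate) drift by perturbing each drift evaluation with zero-mean Gaussian noise.

```python
import numpy as np

def euler_scheme(drift, x0, step, n_steps, rng, drift_noise=0.0):
    """Euler-Maruyama chain X_{k+1} = X_k + h*b(X_k) + sqrt(2h)*Z_k.

    If drift_noise > 0, zero-mean Gaussian noise is added to each drift
    evaluation, mimicking a randomised (inaccurate) drift estimate.
    """
    x = np.array(x0, dtype=float)
    h = step
    for _ in range(n_steps):
        b = drift(x)
        if drift_noise > 0.0:
            b = b + drift_noise * rng.standard_normal(x.shape)
        x = x + h * b + np.sqrt(2.0 * h) * rng.standard_normal(x.shape)
    return x

def drift(x):
    # b(x) = -grad U(x) for the double-well U(x) = ||x||^4/4 - ||x||^2/2:
    # non-log-concave target, but log-concave at infinity.
    return x - np.dot(x, x) * x

rng = np.random.default_rng(0)
samples = np.array([euler_scheme(drift, np.zeros(2), 0.01, 2000, rng)
                    for _ in range(200)])
radii = np.linalg.norm(samples, axis=1)
```

Under the stationary density the mass of this target concentrates near the ring ||x|| = 1, so the empirical radii should cluster around 1; the non-asymptotic bounds in the paper quantify, in Wasserstein distance, how far such a discretised chain is from the SDE's invariant law as a function of the step size h.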


