Non-asymptotic convergence bounds for modified tamed unadjusted Langevin algorithm in non-convex setting

07/06/2022
by   Ariel Neufeld, et al.

We consider the problem of sampling from a high-dimensional target distribution π_β on ℝ^d with density proportional to θ ↦ e^{-βU(θ)} using explicit numerical schemes based on discretising the Langevin stochastic differential equation (SDE). In recent literature, taming has been proposed and studied as a method for ensuring stability of Langevin-based numerical schemes in the case of super-linearly growing drift coefficients for the Langevin SDE. In particular, the Tamed Unadjusted Langevin Algorithm (TULA) was proposed in [Bro+19] to sample from such target distributions with the gradient of the potential U being super-linearly growing. However, theoretical guarantees in Wasserstein distances for Langevin-based algorithms have traditionally been derived assuming strong convexity of the potential U. In this paper, we propose a novel taming factor and derive, under a setting with possibly non-convex potential U and super-linearly growing gradient of U, non-asymptotic theoretical bounds in Wasserstein-1 and Wasserstein-2 distances between the law of our algorithm, which we name the modified Tamed Unadjusted Langevin Algorithm (mTULA), and the target distribution π_β. We obtain respective rates of convergence 𝒪(λ) and 𝒪(λ^{1/2}) in Wasserstein-1 and Wasserstein-2 distances for the discretisation error of mTULA in step size λ. High-dimensional numerical simulations which support our theoretical findings are presented to showcase the applicability of our algorithm.
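The general scheme described above can be sketched in a few lines. The sketch below uses a TULA-style taming factor of the form h(θ)/(1 + λ‖h(θ)‖), as in [Bro+19]; the abstract does not specify the paper's modified (mTULA) taming factor, so this factor, the quartic example potential, and all parameter values are illustrative assumptions only.

```python
import numpy as np

def grad_U(theta):
    # Hypothetical non-convex-style test potential with super-linearly
    # growing gradient: U(theta) = ||theta||^4 / 4, so
    # grad U(theta) = ||theta||^2 * theta.
    return np.dot(theta, theta) * theta

def tamed_langevin_step(theta, lam, beta, rng):
    """One step of a tamed unadjusted Langevin scheme.

    The drift is divided by (1 + lam * ||grad U||) to keep the update
    stable despite the super-linear growth of grad U; this is the
    TULA-style taming of [Bro+19], not the paper's modified factor.
    """
    g = grad_U(theta)
    tamed_drift = g / (1.0 + lam * np.linalg.norm(g))
    noise = rng.standard_normal(theta.shape)
    return theta - lam * tamed_drift + np.sqrt(2.0 * lam / beta) * noise

# Run the chain and average post-burn-in iterates to approximate
# expectations under the target pi_beta (parameters are illustrative).
rng = np.random.default_rng(0)
d, lam, beta = 10, 1e-2, 1.0
theta = rng.standard_normal(d)
samples = []
for n in range(20_000):
    theta = tamed_langevin_step(theta, lam, beta, rng)
    if n >= 10_000:  # discard burn-in
        samples.append(theta.copy())
mean_est = np.mean(samples, axis=0)
```

Without taming, an explicit Euler discretisation of the Langevin SDE with this drift can diverge for fixed step size λ; the taming factor caps the effective drift at norm 1/λ while leaving it essentially unchanged where the gradient is small.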

