Kinetic Langevin MCMC Sampling Without Gradient Lipschitz Continuity – the Strongly Convex Case

01/19/2023
by Tim Johnston, et al.

In this article we consider sampling from log-concave distributions in the Hamiltonian setting, without assuming that the objective gradient is globally Lipschitz. We propose two algorithms based on monotone polygonal (tamed) Euler schemes to sample from a target measure, and provide non-asymptotic 2-Wasserstein distance bounds between the law of the process of each algorithm and the target measure. Finally, we apply these results to bound the excess risk of the associated optimization problem.
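To make the idea concrete, here is a minimal sketch of a tamed Euler discretization of kinetic (underdamped) Langevin dynamics in Python. The taming function, the friction parameter gamma, and the helper name tamed_kinetic_langevin are illustrative assumptions for a generic scheme of this type, not the paper's exact monotone polygonal construction.

```python
import numpy as np

def tamed_kinetic_langevin(grad_U, x0, v0, step, n_steps, gamma=1.0, rng=None):
    """Tamed Euler discretization of kinetic (underdamped) Langevin dynamics.

    The taming g / (1 + sqrt(step) * ||g||) is one standard choice that keeps
    the per-step drift bounded even when grad_U grows superlinearly; the
    paper's monotone polygonal scheme may use a different taming function.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.array(x0, dtype=float)
    v = np.array(v0, dtype=float)
    for _ in range(n_steps):
        g = grad_U(x)
        # Tame the gradient so non-Lipschitz growth cannot blow up the iterates.
        g = g / (1.0 + np.sqrt(step) * np.linalg.norm(g))
        # One Euler step of the position/velocity dynamics with friction gamma.
        x = x + step * v
        v = v - step * (gamma * v + g) + np.sqrt(2.0 * gamma * step) * rng.standard_normal(x.shape)
    return x, v

# Example: a strongly convex potential U(x) = ||x||^2/2 + ||x||^4/4, whose
# gradient (1 + ||x||^2) x grows superlinearly and is not globally Lipschitz.
grad_U = lambda x: (1.0 + np.dot(x, x)) * x
x, v = tamed_kinetic_langevin(grad_U, np.zeros(2), np.zeros(2), step=0.01, n_steps=10_000)
```

Without taming, an explicit Euler scheme applied to such a superlinearly growing gradient can diverge for any fixed step size, which is the failure mode the tamed schemes are designed to avoid.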


