On the Computational Complexity of Metropolis-Adjusted Langevin Algorithms for Bayesian Posterior Sampling

06/13/2022
by Rong Tang, et al.

In this paper, we study the computational complexity of sampling from a Bayesian posterior (or pseudo-posterior) using the Metropolis-adjusted Langevin algorithm (MALA). MALA first uses a discretization of the Langevin SDE to propose a new state, and then adjusts the proposal with a Metropolis-Hastings accept-reject step. Most existing theoretical analyses of MALA rely on smoothness and strong log-concavity of the target distribution, properties that are often not satisfied in practical Bayesian problems. Our analysis instead relies on statistical large-sample theory, which constrains how far the Bayesian posterior can deviate from being smooth and log-concave in a very specific manner. Specifically, we establish an optimal parameter-dimension dependence of d^{1/3} in the non-asymptotic mixing time upper bound for MALA after the burn-in period, without assuming smoothness or log-concavity of the target posterior density; MALA is slightly modified by replacing the gradient with any subgradient wherever the density is non-differentiable. In comparison, the well-known scaling limit for the classical Metropolis random walk (MRW) suggests a linear dependence on the dimension d in its mixing time. Our results therefore formally verify the conventional wisdom that, in the context of Bayesian computation, MALA, as a first-order method using gradient information, is more efficient than MRW, a zeroth-order method that uses only function values.
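To make the proposal-plus-rejection structure concrete, here is a minimal sketch of MALA in Python/NumPy. The function names (mala, log_post, grad_log_post), the fixed step size h, and the Gaussian toy target are illustrative assumptions, not the paper's setup; in particular, the d^{1/3} result concerns a tuned, dimension-dependent step size, and grad_log_post may return any subgradient at non-differentiable points.

```python
# Minimal MALA sketch (illustrative assumptions: names, step size, toy target).
import numpy as np

def mala(log_post, grad_log_post, x0, h, n_iter, rng=None):
    """Metropolis-adjusted Langevin algorithm.

    Proposes y = x + h * grad_log_post(x) + sqrt(2h) * N(0, I), then accepts or
    rejects with the Metropolis-Hastings ratio for this Gaussian proposal.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    samples = np.empty((n_iter, x.size))
    n_accept = 0
    for t in range(n_iter):
        # Langevin (sub)gradient proposal.
        y = x + h * grad_log_post(x) + np.sqrt(2.0 * h) * rng.standard_normal(x.size)
        # Log proposal densities q(y | x) and q(x | y), up to a common constant.
        log_q_fwd = -np.sum((y - x - h * grad_log_post(x)) ** 2) / (4.0 * h)
        log_q_bwd = -np.sum((x - y - h * grad_log_post(y)) ** 2) / (4.0 * h)
        # Metropolis-Hastings accept-reject step.
        log_alpha = log_post(y) - log_post(x) + log_q_bwd - log_q_fwd
        if np.log(rng.uniform()) < log_alpha:
            x, n_accept = y, n_accept + 1
        samples[t] = x
    return samples, n_accept / n_iter

# Toy usage: a standard Gaussian "posterior" in d = 10 dimensions.
if __name__ == "__main__":
    d = 10
    log_post = lambda x: -0.5 * np.sum(x ** 2)
    grad_log_post = lambda x: -x  # gradient exists everywhere in this toy case
    draws, acc_rate = mala(log_post, grad_log_post, np.zeros(d), h=0.1, n_iter=5000)
    print(f"acceptance rate: {acc_rate:.2f}")
```

The accept-reject step is what distinguishes MALA from the unadjusted Langevin algorithm: it makes the target posterior exactly invariant, at the cost of occasionally rejecting proposals.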


