Optimal Scaling for the Proximal Langevin Algorithm in High Dimensions

04/21/2022
by Natesh S. Pillai, et al.

The Metropolis-adjusted Langevin algorithm (MALA) is a sampling algorithm that incorporates the gradient of the logarithm of the target density in its proposal distribution. In an earlier joint work <cit.>, the author extended the seminal work of <cit.> and showed that, in stationarity, MALA applied to an N-dimensional approximation of the target takes O(N^1/3) steps to explore its target measure. It was also shown in <cit.> that, as a consequence of the diffusion limit, the MALA algorithm is optimized at an average acceptance probability of 0.574. In <cit.>, Pereyra introduced the proximal MALA algorithm, in which the gradient of the log target density is replaced by the proximal function (mainly aimed at implementing MALA for non-differentiable target densities). In this paper, we show that for a wide class of twice differentiable target densities, proximal MALA enjoys the same optimal scaling as MALA in high dimensions and likewise has an optimal average acceptance probability of 0.574. The results of this paper thus give the following practically useful guideline: for smooth target densities where the gradient is expensive to compute when implementing MALA, users may replace the gradient with the corresponding proximal function (which can often be computed relatively cheaply via convex optimization) without losing any efficiency. This confirms some of the empirical observations made in <cit.>.
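To make the mechanism concrete, below is a minimal Python sketch (not from the paper) of one proximal MALA step: the gradient drift of standard MALA is replaced by the gradient of the Moreau-Yosida envelope, (prox_{λU}(x) - x)/λ, where U = -log π, with the common choice λ = h/2. The Gaussian toy target, the step size, and all function names are illustrative assumptions, and the prox is computed here by a small numerical convex optimization as a stand-in for whatever cheap proximal solver is available in practice.

```python
import numpy as np
from scipy.optimize import minimize

def U(x):
    # Toy potential U(x) = -log pi(x) for a standard Gaussian target (illustrative only)
    return 0.5 * np.dot(x, x)

def prox_U(x, lam):
    # Proximal map: argmin_z U(z) + ||z - x||^2 / (2*lam).
    # Computed numerically; for this Gaussian toy example the closed form is x / (1 + lam).
    res = minimize(lambda z: U(z) + np.sum((z - x) ** 2) / (2 * lam), x)
    return res.x

def proximal_mala_step(x, h, rng):
    # One proximal MALA step with step size h and lambda = h/2 (a common choice)
    lam = h / 2
    # The gradient of log pi is replaced by (prox(x) - x) / lam
    drift_x = (prox_U(x, lam) - x) / lam
    y = x + (h / 2) * drift_x + np.sqrt(h) * rng.standard_normal(x.shape)
    # Metropolis-Hastings correction for the Gaussian proposal q(y | x)
    drift_y = (prox_U(y, lam) - y) / lam
    log_q_xy = -np.sum((y - x - (h / 2) * drift_x) ** 2) / (2 * h)
    log_q_yx = -np.sum((x - y - (h / 2) * drift_y) ** 2) / (2 * h)
    log_alpha = -U(y) + U(x) + log_q_yx - log_q_xy
    if np.log(rng.uniform()) < log_alpha:
        return y, True
    return x, False

rng = np.random.default_rng(0)
x = np.zeros(10)
accepts = 0
for _ in range(2000):
    x, accepted = proximal_mala_step(x, h=0.5, rng=rng)
    accepts += accepted
print("acceptance rate:", accepts / 2000)  # tune h toward ~0.574 per the paper's guideline
```

The step size h would in practice be tuned so that the empirical acceptance rate is near 0.574, the optimal value established in the paper for both MALA and proximal MALA.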


