Improved analysis for a proximal algorithm for sampling

by Yongxin Chen et al.

We study the proximal sampler of Lee, Shen, and Tian (2021) and obtain new convergence guarantees under weaker assumptions than strong log-concavity: namely, our results hold for (1) weakly log-concave targets, and (2) targets satisfying isoperimetric assumptions which allow for non-log-concavity. We demonstrate our results by obtaining new state-of-the-art sampling guarantees for several classes of target distributions. We also strengthen the connection between the proximal sampler and the proximal method in optimization by interpreting the proximal sampler as an entropically regularized Wasserstein proximal method, and the proximal point method as the limit of the proximal sampler with vanishing noise.
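The proximal sampler alternates a forward Gaussian noising step with a draw from the restricted Gaussian oracle (RGO), i.e. the target conditioned through a quadratic proximal penalty. As a minimal illustrative sketch (not the paper's implementation), the following runs the scheme on a 1-D Gaussian target, where the RGO is available in closed form; the step size `eta` and variance `sigma2` are arbitrary illustrative choices.

```python
import numpy as np

def proximal_sampler_gaussian(sigma2, eta, n_steps, seed=0):
    """Proximal sampler for pi(x) ~ exp(-x^2 / (2*sigma2)) in 1-D.

    Alternates:
      forward step:  y ~ N(x, eta)
      backward step: x ~ pi(x | y) ~ exp(-x^2/(2*sigma2) - (x - y)^2/(2*eta)),
    the restricted Gaussian oracle, which is itself Gaussian here.
    """
    rng = np.random.default_rng(seed)
    x = 0.0
    samples = np.empty(n_steps)
    for k in range(n_steps):
        # Forward step: add Gaussian noise of variance eta.
        y = x + np.sqrt(eta) * rng.standard_normal()
        # Backward step (RGO): conditional is N(post_mean, post_var).
        post_var = 1.0 / (1.0 / sigma2 + 1.0 / eta)
        post_mean = post_var * y / eta
        x = post_mean + np.sqrt(post_var) * rng.standard_normal()
        samples[k] = x
    return samples

samples = proximal_sampler_gaussian(sigma2=1.0, eta=0.5, n_steps=50_000)
```

Because the RGO is exact here, the x-marginal chain leaves the target N(0, sigma2) invariant, so the empirical mean and variance of the samples should approach 0 and 1. Sending `eta` to zero recovers, in the limit, the deterministic proximal point update on the potential, which is the optimization connection the abstract draws.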


Structured Logconcave Sampling with a Restricted Gaussian Oracle

Proximal Langevin Algorithm: Rapid Convergence Under Isoperimetry

Skew Brownian Motion and Complexity of the ALPS Algorithm

Stochastic Proximal Langevin Algorithm: Potential Splitting and Nonasymptotic Rates

The split Gibbs sampler revisited: improvements to its algorithmic structure and augmented target distribution

Proximal Stochastic Dual Coordinate Ascent

The Skipping Sampler: A new approach to sample from complex conditional densities
