Proximal Langevin Algorithm: Rapid Convergence Under Isoperimetry
We study the Proximal Langevin Algorithm (PLA) for sampling from a probability distribution ν = e^-f on R^n under isoperimetry. We prove a convergence guarantee for PLA in Kullback-Leibler (KL) divergence when ν satisfies a log-Sobolev inequality (LSI) and f has bounded second and third derivatives. This improves on the corresponding result for the Unadjusted Langevin Algorithm (ULA) and matches the fastest known rate for sampling under LSI (without a Metropolis filter), with a better dependence on the LSI constant. We also prove convergence guarantees for PLA in Rényi divergence of order q > 1 when the biased limit satisfies either the LSI or the Poincaré inequality.
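As a rough illustration of the kind of algorithm the abstract refers to, the following is a minimal sketch of one common form of the proximal Langevin update, x_{k+1} = prox_{hf}(x_k + sqrt(2h) z_k), where prox_{hf}(y) minimizes f(x) + ||x - y||^2 / (2h) and z_k is standard Gaussian noise. The toy potential f, the step size h, and the numerical proximal solver are illustrative assumptions, not details taken from the paper.

```python
# Hedged sketch of a proximal Langevin step; the exact variant analyzed
# in the paper may differ (e.g., in how the noise and proximal steps are ordered).
import numpy as np
from scipy.optimize import minimize


def f(x):
    # Toy potential: nu = e^{-f} is a standard Gaussian (assumption for illustration).
    return 0.5 * np.dot(x, x)


def prox(y, h):
    # Proximal map of h*f: argmin_x f(x) + ||x - y||^2 / (2h),
    # computed numerically so the sketch applies to generic smooth f.
    obj = lambda x: f(x) + np.dot(x - y, x - y) / (2.0 * h)
    return minimize(obj, y).x


def pla(x0, h, n_iters, rng):
    # One common proximal Langevin iteration: add Gaussian noise (forward step),
    # then take an implicit/backward gradient step via the proximal map.
    x = np.array(x0, dtype=float)
    samples = []
    for _ in range(n_iters):
        y = x + np.sqrt(2.0 * h) * rng.standard_normal(x.shape)
        x = prox(y, h)
        samples.append(x.copy())
    return np.array(samples)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    samples = pla(x0=np.zeros(2), h=0.1, n_iters=5000, rng=rng)
    # For the Gaussian toy target, the post-burn-in mean and covariance should be
    # close to those of the (slightly biased) stationary distribution of the scheme.
    print(samples[1000:].mean(axis=0))
    print(np.cov(samples[1000:].T))
```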