Primal Dual Interpretation of the Proximal Stochastic Gradient Langevin Algorithm

06/16/2020
by Adil Salim, et al.

We consider the task of sampling from a log-concave probability distribution. The potential of the target distribution is assumed to be composite, i.e., written as the sum of a smooth convex term and a nonsmooth convex term that may take infinite values. The target distribution can be seen as a minimizer of the Kullback-Leibler divergence defined over the Wasserstein space (i.e., the space of probability measures). In the first part of this paper, we establish a strong duality result for this minimization problem. In the second part, we use the duality gap arising from the first part to study the complexity of the Proximal Stochastic Gradient Langevin Algorithm (PSGLA), which can be seen as a generalization of the Projected Langevin Algorithm. Our approach relies on viewing PSGLA as a primal-dual algorithm and covers many cases where the target distribution is not fully supported. In particular, we show that if the potential is strongly convex, the complexity of PSGLA is O(1/ε^2) in terms of the 2-Wasserstein distance. By contrast, the complexity of the Projected Langevin Algorithm is O(1/ε^12) in terms of total variation when the potential is convex.
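To fix ideas, the PSGLA iteration combines a stochastic gradient step on the smooth term, additive Gaussian noise, and a proximal step on the nonsmooth term. Below is a minimal sketch of this iteration; the function names (`psgla_sketch`, `grad_f`, `prox_g`) and the particular choice of potential are illustrative assumptions, not taken from the paper. When the nonsmooth term is the indicator of a convex set, the proximal operator is a projection and the scheme reduces to the Projected Langevin Algorithm, as in the example.

```python
import numpy as np

def psgla_sketch(grad_f, prox_g, x0, step, n_iters, rng=None):
    """Sketch of the PSGLA iteration:
        x_{k+1} = prox_{step*g}(x_k - step*grad_f(x_k) + sqrt(2*step)*xi_k),
    where xi_k is standard Gaussian noise. Returns the whole trajectory."""
    rng = np.random.default_rng(0) if rng is None else rng
    x = np.asarray(x0, dtype=float)
    samples = []
    for _ in range(n_iters):
        # Forward (gradient) step on the smooth term, plus Langevin noise.
        noise = np.sqrt(2.0 * step) * rng.standard_normal(x.shape)
        y = x - step * grad_f(x) + noise
        # Backward (proximal) step on the nonsmooth term.
        x = prox_g(y, step)
        samples.append(x.copy())
    return np.array(samples)

# Illustrative example (our choice): smooth, strongly convex potential
# f(x) = ||x||^2 / 2, and g = indicator of the nonnegative orthant, whose
# prox is the projection onto [0, inf)^d. PSGLA then reduces to the
# Projected Langevin Algorithm, targeting a Gaussian truncated to the
# orthant -- a distribution that is not fully supported on R^d.
grad_f = lambda x: x
prox_g = lambda y, gamma: np.maximum(y, 0.0)  # projection, independent of gamma
samples = psgla_sketch(grad_f, prox_g, np.ones(2), step=0.05, n_iters=2000)
```

Every iterate lies in the constraint set by construction, which is the mechanism that lets the analysis cover targets whose support is a proper subset of the space.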


Related research:

05/28/2019 · Stochastic Proximal Langevin Algorithm: Potential Splitting and Nonasymptotic Rates
We propose a new algorithm---Stochastic Proximal Langevin Algorithm (SPL...

02/07/2020 · Wasserstein Proximal Gradient
We consider the task of sampling from a log-concave probability distribu...

06/11/2020 · Stochastic Saddle-Point Optimization for Wasserstein Barycenters
We study the computation of non-regularized Wasserstein barycenters of p...

12/22/2020 · Projected Stochastic Gradient Langevin Algorithms for Constrained Sampling and Non-Convex Learning
Langevin algorithms are gradient descent methods with additive noise. Th...

08/15/2022 · Nesterov smoothing for sampling without smoothness
We study the problem of sampling from a target distribution in ℝ^d whose...

02/26/2018 · Analysis of Langevin Monte Carlo via convex optimization
In this paper, we provide new insights on the Unadjusted Langevin Algori...

06/10/2020 · Composite Logconcave Sampling with a Restricted Gaussian Oracle
We consider sampling from composite densities on ℝ^d of the form dπ(x) ∝...
