
A Proximal Algorithm for Sampling from Nonsmooth Potentials
Markov chain Monte Carlo (MCMC) is an effective and dominant method to s...

Structured Logconcave Sampling with a Restricted Gaussian Oracle
We give algorithms for sampling several structured logconcave families t...

Accelerated Stochastic Mirror Descent Algorithms For Composite Nonstrongly Convex Optimization
We consider the problem of minimizing the sum of an average function of ...

Composite Logconcave Sampling with a Restricted Gaussian Oracle
We consider sampling from composite densities on ℝ^d of the form dπ(x) ∝...

Composite Self-Concordant Minimization
We propose a variable metric framework for minimizing the sum of a self...

Stochastic Proximal Langevin Algorithm: Potential Splitting and Nonasymptotic Rates
We propose a new algorithm, the Stochastic Proximal Langevin Algorithm (SPL...

Efficient sampling generation from explicit densities via Normalizing Flows
For many applications, such as computing the expected value of different...
An Efficient Sampling Algorithm for Nonsmooth Composite Potentials
We consider the problem of sampling from a density of the form p(x) ∝ exp(−(f(x) + g(x))), where f: R^d → R is a smooth and strongly convex function and g: R^d → R is a convex and Lipschitz function. We propose a new algorithm based on the Metropolis-Hastings framework, and prove that it mixes to within TV distance ε of the target density in at most O(d log(d/ε)) iterations. This guarantee extends previous results on sampling from distributions with smooth log densities (g = 0) to the more general composite nonsmooth case, with the same mixing time up to a multiple of the condition number. Our method is based on a novel proximal-based proposal distribution that can be efficiently computed for a large class of nonsmooth functions g.
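The idea of a proximal-based proposal inside Metropolis-Hastings can be sketched as follows. This is an illustrative reconstruction, not the paper's exact algorithm: the proposal mean here (a gradient step on f followed by the proximal map of g, with Gaussian noise) and all function names are assumptions chosen to mirror the description above.

```python
import numpy as np

def soft_threshold(x, t):
    # Proximal map of t * ||.||_1 (used below for g = lam * ||x||_1).
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def proximal_mh_sampler(grad_f, potential, prox_g, x0, eta, n_steps, rng):
    """Metropolis-Hastings sampler for p(x) ∝ exp(-f(x) - g(x)).

    Proposal: y ~ N(m(x), 2*eta*I) with m(x) = prox_{eta*g}(x - eta*grad_f(x)),
    i.e. a gradient step on the smooth part f followed by the proximal map
    of the nonsmooth part g. The asymmetric proposal is corrected by the
    standard Metropolis-Hastings accept/reject ratio.
    """
    x = np.asarray(x0, dtype=float)
    samples = []
    def prop_mean(z):
        return prox_g(z - eta * grad_f(z), eta)
    for _ in range(n_steps):
        m = prop_mean(x)
        y = m + np.sqrt(2.0 * eta) * rng.standard_normal(x.shape)
        # log q(y|x) and log q(x|y), up to the common Gaussian normalizer
        log_q_fwd = -np.sum((y - m) ** 2) / (4.0 * eta)
        log_q_bwd = -np.sum((x - prop_mean(y)) ** 2) / (4.0 * eta)
        log_alpha = (potential(x) - potential(y)) + log_q_bwd - log_q_fwd
        if np.log(rng.uniform()) < log_alpha:
            x = y
        samples.append(x.copy())
    return np.array(samples)
```

For instance, with f(x) = ||x||^2 / 2 and g(x) = λ||x||_1 the proximal map is the soft-thresholding operator, so each proposal can be computed in closed form; the same template applies to any g whose proximal map is cheap.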