
On sampling from a log-concave density using kinetic Langevin diffusions
Langevin diffusion processes and their discretizations are often used fo...

Composite Logconcave Sampling with a Restricted Gaussian Oracle
We consider sampling from composite densities on ℝ^d of the form dπ(x) ∝...

Accelerated Stochastic Mirror Descent Algorithms For Composite Non-strongly Convex Optimization
We consider the problem of minimizing the sum of an average function of ...

High-Order Langevin Diffusion Yields an Accelerated MCMC Algorithm
We propose a Markov chain Monte Carlo (MCMC) algorithm based on third-or...

Efficient sampling generation from explicit densities via Normalizing Flows
For many applications, such as computing the expected value of different...

Rapid Convergence of the Unadjusted Langevin Algorithm: Log-Sobolev Suffices
We prove a convergence guarantee on the unadjusted Langevin algorithm fo...
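For reference, the unadjusted Langevin algorithm discretizes the Langevin diffusion as x ← x − η∇f(x) + √(2η)·ξ with ξ ~ N(0, I). A minimal sketch for the standard Gaussian target f(x) = x²/2 (chosen here only for illustration; the paper's log-Sobolev analysis covers far more general targets):

```python
import math
import random

def ula(n_iters, eta=0.1, seed=0):
    """Unadjusted Langevin algorithm for the 1-D standard Gaussian target.

    Update rule: x <- x - eta * grad_f(x) + sqrt(2 * eta) * N(0, 1),
    where grad_f(x) = x for f(x) = x^2 / 2.
    """
    rng = random.Random(seed)
    x = 0.0
    samples = []
    for _ in range(n_iters):
        x = x - eta * x + math.sqrt(2.0 * eta) * rng.gauss(0.0, 1.0)
        samples.append(x)
    return samples
```

Because there is no Metropolis correction, the chain carries a discretization bias: for this Gaussian target its stationary variance is 1/(1 − η/2), slightly above the true value 1, which shrinks as the step size η decreases.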

Smooth Strongly Convex Regression
Convex regression (CR) is the problem of fitting a convex function to a ...
An Efficient Sampling Algorithm for Non-smooth Composite Potentials
We consider the problem of sampling from a density of the form p(x) ∝ exp(−(f(x) + g(x))), where f: ℝ^d → ℝ is a smooth and strongly convex function and g: ℝ^d → ℝ is a convex and Lipschitz function. We propose a new algorithm based on the Metropolis-Hastings framework, and prove that it mixes to within TV distance ε of the target density in at most O(d log(d/ε)) iterations. This guarantee extends previous results on sampling from distributions with smooth log densities (g = 0) to the more general composite non-smooth case, with the same mixing time up to a multiple of the condition number. Our method is based on a novel proximal-based proposal distribution that can be efficiently computed for a large class of non-smooth functions g.
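To illustrate the general idea of a proximal proposal inside a Metropolis-Hastings loop, here is a minimal 1-D sketch, not the paper's exact algorithm: the proposal drifts along ∇f plus the gradient of the Moreau envelope of g, computed via the proximal operator, and the accept/reject step uses the exact composite potential. The concrete choices f(x) = x²/2 and g(x) = |x| (whose prox is soft-thresholding) are assumptions for illustration only:

```python
import math
import random

def soft_threshold(x, lam):
    # Proximal operator of lam * |x|: shrink x toward zero by lam.
    return math.copysign(max(abs(x) - lam, 0.0), x)

def grad_smoothed(x, lam):
    # grad f(x) = x, plus the gradient of the Moreau envelope of g(x) = |x|,
    # which equals (x - prox_{lam*g}(x)) / lam.
    return x + (x - soft_threshold(x, lam)) / lam

def potential(x):
    # Exact composite potential U(x) = f(x) + g(x) = x^2 / 2 + |x|.
    return 0.5 * x * x + abs(x)

def prox_mala(n_iters, eta=0.2, lam=0.5, seed=0):
    """Metropolis-adjusted proximal Langevin sampler for p(x) ∝ exp(-U(x))."""
    rng = random.Random(seed)
    x = 0.0
    samples = []
    for _ in range(n_iters):
        # Gaussian proposal centered at a smoothed-gradient step from x.
        mean_x = x - eta * grad_smoothed(x, lam)
        y = mean_x + math.sqrt(2.0 * eta) * rng.gauss(0.0, 1.0)
        mean_y = y - eta * grad_smoothed(y, lam)
        # Metropolis-Hastings log acceptance ratio (proposal is not symmetric).
        log_q_xy = -((x - mean_y) ** 2) / (4.0 * eta)  # log q(x | y), up to a constant
        log_q_yx = -((y - mean_x) ** 2) / (4.0 * eta)  # log q(y | x), up to a constant
        log_alpha = potential(x) - potential(y) + log_q_xy - log_q_yx
        if math.log(rng.random()) < log_alpha:
            x = y
        samples.append(x)
    return samples
```

Because the accept/reject step evaluates the exact potential U, the chain targets p(x) ∝ exp(−U(x)) exactly; the proximal smoothing only shapes the proposal so that the non-smooth term g does not destabilize the drift.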