
Structured Logconcave Sampling with a Restricted Gaussian Oracle
We give algorithms for sampling several structured log-concave families to high accuracy. We further develop a reduction framework, inspired by proximal point methods in convex optimization, which bootstraps samplers for regularized densities to improve dependences on problem conditioning. A key ingredient in our framework is the notion of a "restricted Gaussian oracle" (RGO) for g: ℝ^d → ℝ, which is a sampler for distributions whose negative log-likelihood sums a quadratic and g. By combining our reduction framework with our new samplers, we obtain the following bounds for sampling structured distributions to total variation distance ϵ. For composite densities exp(−f(x) − g(x)), where f has condition number κ and the convex (but possibly non-smooth) g admits an RGO, we obtain a mixing time of O(κd log^3(κd/ϵ)), matching the state-of-the-art non-composite bound; no composite samplers with better mixing than general-purpose log-concave samplers were previously known. For log-concave finite sums exp(−F(x)), where F(x) = (1/n)∑_{i ∈ [n]} f_i(x) has condition number κ, we give a sampler querying O(n + κ·max(d, √(nd))) gradient oracles to {f_i}_{i ∈ [n]}; no high-accuracy samplers with non-trivial gradient query complexity were previously known. For densities with condition number κ, we give an algorithm obtaining mixing time O(κd log^2(κd/ϵ)), improving the prior state-of-the-art by a logarithmic factor with a significantly simpler analysis; we also show a zeroth-order algorithm attains the same query complexity.
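To make the RGO abstraction concrete: it must sample from densities proportional to exp(−‖x − y‖²/(2η) − g(x)), i.e. a Gaussian tilted by g. The sketch below is not the paper's algorithm, just a minimal illustration under the assumption that g is nonnegative (as with g(x) = λ‖x‖₁): propose from the Gaussian N(y, ηI) and accept with probability exp(−g(x)), a standard rejection sampler whose accepted draws have exactly the composite density. All names (`rgo_rejection`, the choice of g, the parameter values) are hypothetical.

```python
import numpy as np

def rgo_rejection(y, eta, g, rng, max_tries=10_000):
    """Draw one sample from density proportional to
    exp(-||x - y||^2 / (2*eta) - g(x)), assuming g(x) >= 0 everywhere
    (shift g by its minimum if the minimum is known but nonzero).

    Proposal: x ~ N(y, eta * I); acceptance probability exp(-g(x)),
    which is a valid rejection scheme since exp(-g(x)) <= 1."""
    for _ in range(max_tries):
        x = y + np.sqrt(eta) * rng.standard_normal(y.shape)
        if rng.random() < np.exp(-g(x)):
            return x
    raise RuntimeError("rejection sampler failed to accept a proposal")

# Illustration with g(x) = lam * ||x||_1, the composite density
# exp(-||x - y||^2 / (2*eta) - lam * ||x||_1): samples concentrate
# between y and the origin, reflecting the l1 penalty's shrinkage.
rng = np.random.default_rng(0)
y = np.ones(3)
lam, eta = 1.0, 0.5
g = lambda x: lam * np.abs(x).sum()
samples = np.array([rgo_rejection(y, eta, g, rng) for _ in range(2000)])
print(samples.mean(axis=0))  # each coordinate pulled below 1 toward 0
```

Rejection is only practical when exp(−g) is not too small on typical proposals; the point of the paper's framework is that whenever *any* efficient RGO implementation exists for g (rejection, coordinate-wise exact sampling, or otherwise), it can be bootstrapped into a fast sampler for the full composite density.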