Structured Logconcave Sampling with a Restricted Gaussian Oracle

10/07/2020 · by Yin Tat Lee, et al.

We give algorithms for sampling several structured logconcave families to high accuracy. We further develop a reduction framework, inspired by proximal point methods in convex optimization, which bootstraps samplers for regularized densities to improve dependences on problem conditioning. A key ingredient in our framework is the notion of a "restricted Gaussian oracle" (RGO) for g: ℝ^d → ℝ, which is a sampler for distributions whose negative log-likelihood is the sum of a quadratic and g.

By combining our reduction framework with our new samplers, we obtain the following bounds for sampling structured distributions to total variation distance ϵ.

For composite densities exp(-f(x) - g(x)), where f has condition number κ and the convex (but possibly non-smooth) g admits an RGO, we obtain a mixing time of O(κ d log^3(κ d / ϵ)), matching the state-of-the-art non-composite bound; no composite samplers with better mixing than general-purpose logconcave samplers were previously known.

For logconcave finite sums exp(-F(x)), where F(x) = (1/n) ∑_{i ∈ [n]} f_i(x) has condition number κ, we give a sampler querying O(n + κ max(d, √(nd))) gradient oracles to {f_i}_{i ∈ [n]}; no high-accuracy samplers with nontrivial gradient query complexity were previously known.

For densities with condition number κ, we give an algorithm obtaining mixing time O(κ d log^2(κ d / ϵ)), improving the prior state-of-the-art by a logarithmic factor with a significantly simpler analysis; we also show that a zeroth-order algorithm attains the same query complexity.
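To make the RGO abstraction concrete, here is a minimal illustrative sketch (not taken from the paper) for one especially simple choice of g: the indicator of a box, for which the RGO reduces to sampling a Gaussian centered at y restricted to the box. The function name `rgo_box` and the naive rejection strategy are assumptions for illustration; practical RGO implementations for nontrivial g would use more sophisticated samplers.

```python
import numpy as np

def rgo_box(y, eta, lo, hi, rng, max_tries=100_000):
    """Illustrative restricted Gaussian oracle for g = indicator of [lo, hi]^d.

    Draws x with density proportional to exp(-||x - y||^2 / (2 * eta))
    restricted to the box, i.e. negative log-likelihood = quadratic + g(x).
    Uses naive rejection sampling, which is only reasonable when y is
    inside (or near) the box; this is a sketch, not an efficient oracle.
    """
    d = y.shape[0]
    for _ in range(max_tries):
        # Propose from the unrestricted Gaussian N(y, eta * I).
        x = y + np.sqrt(eta) * rng.standard_normal(d)
        # Accept iff the proposal satisfies the box constraint (g(x) = 0).
        if np.all(x >= lo) and np.all(x <= hi):
            return x
    raise RuntimeError("rejection sampling failed; y may be far from the box")

rng = np.random.default_rng(0)
y = np.zeros(4)                       # center of the box
x = rgo_box(y, eta=0.1, lo=-1.0, hi=1.0, rng=rng)
```

The reduction framework would invoke such an oracle repeatedly with varying centers y and regularization scales η, composing it with a sampler for the smooth part f.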
