Oracle lower bounds for stochastic gradient sampling algorithms

02/01/2020
by Niladri S. Chatterji, et al.

We consider the problem of sampling from a strongly log-concave density in R^d and prove an information-theoretic lower bound on the number of stochastic gradient queries of the log density that are needed. Several popular sampling algorithms (including many Markov chain Monte Carlo methods) operate by using stochastic gradients of the log density to generate a sample; our results establish an information-theoretic limit for all of these algorithms. We show that for every algorithm there exists a well-conditioned, strongly log-concave target density for which the distribution of the points generated by the algorithm is at least ε away from the target in total variation distance whenever the number of gradient queries is less than Ω(σ^2 d/ε^2), where σ^2 d is the variance of the stochastic gradients. Our lower bound follows by combining the notion of Le Cam deficiency, routinely used in the comparison of statistical experiments, with standard information-theoretic tools for lower bounding the Bayes risk. To the best of our knowledge, our results provide the first nontrivial dimension-dependent lower bound for this problem.
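To make the setting concrete, here is a minimal sketch (not taken from the paper) of the kind of algorithm the lower bound covers: stochastic gradient Langevin dynamics querying a noisy gradient oracle for the log density. The Gaussian target, the per-coordinate noise level `sigma_noise`, and all parameter values are illustrative assumptions; the oracle's total noise variance scales as σ^2 d, matching the abstract's parameterization.

```python
import numpy as np

def noisy_grad_log_density(x, mean, sigma_noise, rng):
    """Stochastic gradient oracle for the log density of N(mean, I).

    The exact gradient is -(x - mean); we add zero-mean Gaussian noise
    with per-coordinate variance sigma_noise^2, so the total variance of
    a query is sigma_noise^2 * d, as in the abstract's sigma^2 d.
    """
    return -(x - mean) + sigma_noise * rng.standard_normal(x.shape)

def sgld_sample(d=10, n_queries=5000, step=1e-2, sigma_noise=1.0, seed=0):
    """Run SGLD for n_queries oracle calls and return the final iterate."""
    rng = np.random.default_rng(seed)
    mean = np.ones(d)          # illustrative target: N(1, I) in R^d
    x = np.zeros(d)            # arbitrary starting point
    for _ in range(n_queries):
        g = noisy_grad_log_density(x, mean, sigma_noise, rng)
        # Langevin update: gradient step plus injected Gaussian noise
        x = x + step * g + np.sqrt(2 * step) * rng.standard_normal(d)
    return x

sample = sgld_sample()
```

The lower bound says that for algorithms of this form, no choice of step size or update rule can get within ε of every well-conditioned target in total variation using fewer than Ω(σ^2 d/ε^2) such oracle queries.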


Related research

- Faster Convergence of Stochastic Gradient Langevin Dynamics for Non-Log-Concave Sampling (10/19/2020)
- Query lower bounds for log-concave sampling (04/05/2023)
- Stochastic Variance-Reduced Hamilton Monte Carlo Methods (02/13/2018)
- Estimating Normalizing Constants for Log-Concave Distributions: Algorithms and Lower Bounds (11/08/2019)
- Fisher information lower bounds for sampling (10/05/2022)
- Information-Theoretic Lower Bounds for Zero-Order Stochastic Gradient Estimation (03/31/2020)
- Support Size Estimation: The Power of Conditioning (11/22/2022)
