Dimensionally Tight Running Time Bounds for Second-Order Hamiltonian Monte Carlo

02/24/2018 ∙ by Oren Mangoubi, et al.

Hamiltonian Monte Carlo (HMC) is a widely deployed method for sampling from high-dimensional distributions in statistics and machine learning. HMC is known to run very efficiently in practice, and its second-order variant was conjectured in 1988 to require only d^1/4 steps. Here we show that this conjecture holds when sampling from strongly log-concave target distributions that satisfy weak third-order regularity properties associated with the input data. This improves upon a recent result showing that the number of steps of the second-order discretization of HMC grows like d^1/4 under the much stronger assumption that the distribution is separable and its first four Fréchet derivatives are bounded. Our result also compares favorably with the best available running time bounds for the class of strongly log-concave distributions, namely the current best bounds for the overdamped Langevin, underdamped Langevin, and first-order HMC algorithms, all of which grow like d^1/2 with the dimension. Key to our result is a new regularity condition for the Hessian that may be of independent interest. The class of distributions satisfying this condition is natural and includes posterior distributions used in Bayesian logistic "ridge" regression.
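The "second-order" variant discussed above refers to HMC with a leapfrog (Störmer–Verlet) integrator, a second-order symplectic discretization of Hamiltonian dynamics. Below is a minimal sketch of leapfrog HMC with a Metropolis correction, not the paper's analyzed algorithm; the standard-Gaussian target, step size, and trajectory length are illustrative choices, not values from the paper.

```python
import numpy as np

def leapfrog_hmc(U, grad_U, x0, step_size, n_leapfrog, n_samples, rng):
    """Sample from a density proportional to exp(-U(x)) using leapfrog HMC.

    The leapfrog integrator alternates half momentum steps with full
    position steps; its O(step_size^2) local error is what makes this
    the "second-order" discretization of the Hamiltonian dynamics.
    """
    x = np.asarray(x0, dtype=float)
    d = x.size
    samples = np.empty((n_samples, d))
    for i in range(n_samples):
        p = rng.standard_normal(d)          # resample momentum each iteration
        x_new, p_new = x.copy(), p.copy()
        # Leapfrog trajectory: half-step, (n-1) full steps, final half-step.
        p_new -= 0.5 * step_size * grad_U(x_new)
        for _ in range(n_leapfrog - 1):
            x_new += step_size * p_new
            p_new -= step_size * grad_U(x_new)
        x_new += step_size * p_new
        p_new -= 0.5 * step_size * grad_U(x_new)
        # Metropolis accept/reject removes the discretization bias.
        h_old = U(x) + 0.5 * p @ p
        h_new = U(x_new) + 0.5 * p_new @ p_new
        if rng.random() < np.exp(min(0.0, h_old - h_new)):
            x = x_new
        samples[i] = x
    return samples

# Illustrative target: standard Gaussian, U(x) = ||x||^2 / 2 (strongly log-concave).
rng = np.random.default_rng(0)
samples = leapfrog_hmc(
    U=lambda x: 0.5 * x @ x,
    grad_U=lambda x: x,
    x0=np.zeros(2),
    step_size=0.2,
    n_leapfrog=10,
    n_samples=2000,
    rng=rng,
)
```

For this quadratic potential the empirical mean and variance of the chain should be close to 0 and 1 in each coordinate, which gives a quick sanity check on the integrator and the accept/reject step.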


