
Faster Hamiltonian Monte Carlo by Learning Leapfrog Scale

by Changye Wu, et al.

Hamiltonian Monte Carlo (HMC) samplers have become standard algorithms for MCMC implementations, often preferred over more basic alternatives, but they still require some amount of tuning and calibration. Exploiting the U-turn criterion of the NUTS algorithm (Hoffman and Gelman, 2014), we propose a version of HMC that relies on the distribution of the integration time of the associated leapfrog integrator. In addition, using the primal-dual averaging method to tune the step size of the integrator, we achieve an essentially calibration-free version of HMC. When compared with the original NUTS on several benchmarks, this algorithm exhibits significantly improved efficiency.
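To make the ingredients concrete, the following is a minimal sketch of a plain HMC sampler with a leapfrog integrator on a one-dimensional Gaussian target. It is an illustration of the baseline algorithm only, not the paper's method: the step size and number of leapfrog steps are fixed by hand here, whereas the proposed sampler learns the integration-time distribution from NUTS-style U-turn information and tunes the step size via primal-dual averaging. All function names and parameter values are illustrative assumptions.

```python
import numpy as np

# Log-density and gradient of a standard Gaussian N(0, 1),
# chosen purely for simplicity of illustration.
def log_prob(q):
    return -0.5 * q ** 2

def grad_log_prob(q):
    return -q

def leapfrog(q, p, step_size, n_steps):
    """Simulate Hamiltonian dynamics with the leapfrog integrator."""
    p = p + 0.5 * step_size * grad_log_prob(q)   # initial half step (momentum)
    for _ in range(n_steps - 1):
        q = q + step_size * p                    # full step (position)
        p = p + step_size * grad_log_prob(q)     # full step (momentum)
    q = q + step_size * p
    p = p + 0.5 * step_size * grad_log_prob(q)   # final half step (momentum)
    return q, p

def hmc(n_samples, step_size=0.2, n_steps=10, seed=0):
    """Basic HMC with fixed step size and leapfrog path length."""
    rng = np.random.default_rng(seed)
    q = 0.0
    samples = []
    for _ in range(n_samples):
        p = rng.standard_normal()                # resample momentum
        # Hamiltonian = potential energy -log_prob(q) + kinetic energy p^2/2
        h_old = -log_prob(q) + 0.5 * p ** 2
        q_new, p_new = leapfrog(q, p, step_size, n_steps)
        h_new = -log_prob(q_new) + 0.5 * p_new ** 2
        # Metropolis correction for the discretization error of leapfrog
        if rng.random() < np.exp(h_old - h_new):
            q = q_new
        samples.append(q)
    return np.array(samples)

samples = hmc(5000)
print(samples.mean(), samples.std())  # should be near 0 and 1 respectively
```

The fixed `n_steps` in this sketch is exactly the quantity the paper targets: rather than a hard-coded path length, the proposed method draws the integration time from a distribution calibrated with the U-turn criterion, i.e. stopping when the trajectory starts doubling back on itself (when the dot product of the displacement and the momentum becomes negative).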



