
Relativistic Monte Carlo
Hamiltonian Monte Carlo (HMC) is a popular Markov chain Monte Carlo (MCM...

Neural Network Gradient Hamiltonian Monte Carlo
Hamiltonian Monte Carlo is a widely used algorithm for sampling from pos...

Towards Unifying Hamiltonian Monte Carlo and Slice Sampling
We unify slice sampling and Hamiltonian Monte Carlo (HMC) sampling, demo...

Predictive Uncertainty in Large Scale Classification using Dropout - Stochastic Gradient Hamiltonian Monte Carlo
Predictive uncertainty is crucial for many computer vision tasks, from i...

Bayesian cosmic density field inference from redshift space dark matter maps
We present a self-consistent Bayesian formalism to sample the primordial...

Implicit Hamiltonian Monte Carlo for Sampling Multiscale Distributions
Hamiltonian Monte Carlo (HMC) has been widely adopted in the statistics ...

Sampling constrained probability distributions using Spherical Augmentation
Statistical models with constrained probability distributions are abunda...
Quantum-Inspired Hamiltonian Monte Carlo for Bayesian Sampling
Hamiltonian Monte Carlo (HMC) is an efficient Bayesian sampling method that can make distant proposals in the parameter space by simulating a Hamiltonian dynamical system. Despite its popularity in machine learning and data science, HMC is inefficient at sampling from spiky and multimodal distributions. Motivated by the energy-time uncertainty relation from quantum mechanics, we propose a Quantum-Inspired Hamiltonian Monte Carlo algorithm (QHMC). This algorithm allows a particle to have a random mass drawn from a probability distribution rather than a fixed mass. We prove the convergence property of QHMC in the spatial domain and in the time sequence. We further show why such a random mass can improve the performance when we sample a broad class of distributions. In order to handle the big training data sets in large-scale machine learning, we develop a stochastic gradient version of QHMC using a Nosé-Hoover thermostat, called QSGNHT, and we also provide theoretical justifications for its steady-state distributions. Finally, in the experiments, we demonstrate the effectiveness of QHMC and QSGNHT on synthetic examples, bridge regression, image denoising and neural network pruning. The proposed QHMC and QSGNHT can indeed achieve much more stable and accurate sampling results on the test cases.
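The core idea in the abstract, replacing HMC's fixed mass with a mass resampled from a distribution at each iteration, can be sketched in a few lines. The following is a minimal 1-D illustration, not the authors' implementation: the log-normal mass distribution, the step size, and the trajectory length are all illustrative assumptions.

```python
import numpy as np

def qhmc_sample(log_prob, grad_log_prob, x0, n_samples=2000,
                n_leapfrog=20, step_size=0.1,
                mass_mu=0.0, mass_sigma=0.5, seed=0):
    """Quantum-inspired HMC sketch: the particle mass is redrawn each
    iteration from a distribution (here log-normal, a simple choice of
    positive-valued mass law) instead of being held fixed."""
    rng = np.random.default_rng(seed)
    x = float(x0)
    samples = []
    for _ in range(n_samples):
        m = rng.lognormal(mass_mu, mass_sigma)   # random mass
        p = rng.normal(0.0, np.sqrt(m))          # momentum ~ N(0, m)
        x_new, p_new = x, p
        # leapfrog integration of the Hamiltonian dynamics
        p_new += 0.5 * step_size * grad_log_prob(x_new)
        for _ in range(n_leapfrog - 1):
            x_new += step_size * p_new / m
            p_new += step_size * grad_log_prob(x_new)
        x_new += step_size * p_new / m
        p_new += 0.5 * step_size * grad_log_prob(x_new)
        # Metropolis correction with kinetic energy p^2 / (2m)
        h_old = -log_prob(x) + p ** 2 / (2.0 * m)
        h_new = -log_prob(x_new) + p_new ** 2 / (2.0 * m)
        if np.log(rng.random()) < h_old - h_new:
            x = x_new
        samples.append(x)
    return np.array(samples)

# usage: sample from a standard normal target
samples = qhmc_sample(lambda x: -0.5 * x ** 2, lambda x: -x, x0=0.0)
```

A heavy random mass slows the particle down (useful in spiky regions), while a light one lets it travel far (useful for crossing between modes); averaging over masses is what the abstract credits for the improved performance.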