Deep Learning Hamiltonian Monte Carlo

05/07/2021
by Sam Foreman, et al.

We generalize the Hamiltonian Monte Carlo algorithm with a stack of neural network layers and evaluate its ability to sample from different topologies in a two-dimensional lattice gauge theory. We demonstrate that our model successfully mixes between modes of different topologies, significantly reducing the computational cost required to generate independent gauge field configurations. Our implementation is available at https://github.com/saforem2/l2hmc-qcd .
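For context, the abstract's starting point is standard Hamiltonian Monte Carlo: a leapfrog integrator evolves position and momentum, followed by a Metropolis accept/reject step. The sketch below shows that base algorithm on a toy 2D Gaussian target; it is purely illustrative, and none of the names come from the l2hmc-qcd codebase (the paper replaces parts of this update with trainable neural network layers).

```python
import numpy as np

def grad_log_prob(x):
    # Toy target: standard 2D Gaussian, so grad log p(x) = -x.
    return -x

def leapfrog(x, v, step_size, n_steps):
    """One HMC trajectory: half-step momentum, full-step position updates."""
    v = v + 0.5 * step_size * grad_log_prob(x)
    for _ in range(n_steps - 1):
        x = x + step_size * v
        v = v + step_size * grad_log_prob(x)
    x = x + step_size * v
    v = v + 0.5 * step_size * grad_log_prob(x)
    return x, v

def hmc_sample(x0, n_samples, step_size=0.2, n_steps=10, seed=0):
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    samples = []
    for _ in range(n_samples):
        # Resample momentum from a unit Gaussian each trajectory.
        v = rng.standard_normal(x.shape)
        # Hamiltonian = potential energy (0.5 x.x for this target) + kinetic.
        h0 = 0.5 * x @ x + 0.5 * v @ v
        x_new, v_new = leapfrog(x, v, step_size, n_steps)
        h1 = 0.5 * x_new @ x_new + 0.5 * v_new @ v_new
        # Metropolis accept/reject keeps the target distribution exact.
        if rng.random() < np.exp(h0 - h1):
            x = x_new
        samples.append(x.copy())
    return np.array(samples)

samples = hmc_sample(np.zeros(2), 2000)
```

The paper's generalization (following the L2HMC line of work) makes the scale and translation terms inside this update learnable, which is what lets the sampler hop between topological sectors that plain HMC mixes through only slowly.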


Related research

12/19/2018 · Inference with Hamiltonian Sequential Monte Carlo Simulators
The paper proposes a new Monte-Carlo simulator combining the advantages ...

10/20/2017 · Zero Variance and Hamiltonian Monte Carlo Methods in GARCH Models
In this paper, we develop Bayesian Hamiltonian Monte Carlo methods for i...

12/02/2021 · HMC with Normalizing Flows
We propose using Normalizing Flows as a trainable kernel within the mole...

11/21/2020 · Multi-experiment parameter identifiability of ODEs and model theory
Structural identifiability is a property of an ODE model with parameters...

10/11/2018 · Stochastic Approximation Hamiltonian Monte Carlo
Recently, the Hamilton Monte Carlo (HMC) has become widespread as one of...

08/22/2018 · Approximating Poker Probabilities with Deep Learning
Many poker systems, whether created with heuristics or machine learning,...

05/21/2018 · Adaptive Monte-Carlo Optimization
The celebrated Monte Carlo method estimates a quantity that is expensive...
