HMC with Normalizing Flows

12/02/2021
by Sam Foreman, et al.

We propose using Normalizing Flows as a trainable kernel within the molecular dynamics update of Hamiltonian Monte Carlo (HMC). By learning (invertible) transformations that simplify our dynamics, we can outperform traditional methods at generating independent configurations. We show that, using a carefully constructed network architecture, our approach can be easily scaled to large lattice volumes with minimal retraining effort. The source code for our implementation is publicly available online at https://github.com/nftqcd/fthmc.
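To make the idea in the abstract concrete, below is a minimal PyTorch sketch of HMC run on flow-transformed variables: an invertible map contributes a log-det-Jacobian term to the effective action, and leapfrog integration plus a Metropolis test proceed as usual in the transformed space. This is an illustration only; the names (`flow`, `action`, `effective_action`, `hmc_step`) and the assumption that the flow returns `(x, logdet)` are hypothetical and do not reflect the actual fthmc API.

```python
import torch

def effective_action(flow, y, action):
    # Action in flow-transformed coordinates:
    #   S_eff(y) = S(F(y)) - log|det dF/dy|
    # `flow` is assumed to return both F(y) and the log-det-Jacobian.
    x, logdet = flow(y)
    return action(x) - logdet

def leapfrog(y, p, grad_fn, step_size, n_steps):
    # Standard leapfrog integrator, run in the transformed space.
    p = p - 0.5 * step_size * grad_fn(y)
    for _ in range(n_steps - 1):
        y = y + step_size * p
        p = p - step_size * grad_fn(y)
    y = y + step_size * p
    p = p - 0.5 * step_size * grad_fn(y)
    return y, p

def hmc_step(flow, y, action, step_size=0.1, n_steps=10):
    # One HMC update with the flow acting as a (trainable) kernel.
    def grad_fn(z):
        z = z.detach().requires_grad_(True)
        return torch.autograd.grad(effective_action(flow, z, action), z)[0]

    p0 = torch.randn_like(y)
    h0 = effective_action(flow, y, action) + 0.5 * (p0 ** 2).sum()
    y_new, p_new = leapfrog(y, p0, grad_fn, step_size, n_steps)
    h_new = effective_action(flow, y_new, action) + 0.5 * (p_new ** 2).sum()
    # The Metropolis accept/reject step keeps the sampler exact even if
    # the flow is imperfect; a better flow simply raises acceptance.
    if torch.rand(()) < torch.exp(h0 - h_new):
        return y_new.detach()
    return y.detach()
```

The design point worth noting is that the flow only reshapes the energy landscape seen by the integrator; the accept/reject step restores exactness, so training the flow affects efficiency, not correctness.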



Related research:

- LeapfrogLayers: A Trainable Framework for Effective Topological Sampling (12/02/2021): "We introduce LeapfrogLayers, an invertible neural network architecture t..."
- Deep Learning Hamiltonian Monte Carlo (05/07/2021): "We generalize the Hamiltonian Monte Carlo algorithm with a stack of neur..."
- Do Deep Learning Methods Really Perform Better in Molecular Conformation Generation? (02/14/2023): "Molecular conformation generation (MCG) is a fundamental and important p..."
- Efficient and Equivariant Graph Networks for Predicting Quantum Hamiltonian (06/08/2023): "We consider the prediction of the Hamiltonian matrix, which finds use in..."
- Couplings for Andersen Dynamics (09/29/2020): "Andersen dynamics is a standard method for molecular simulations, and a ..."
- PyHopper – Hyperparameter optimization (10/10/2022): "Hyperparameter tuning is a fundamental aspect of machine learning resear..."
- AMT: All-Pairs Multi-Field Transforms for Efficient Frame Interpolation (04/19/2023): "We present All-Pairs Multi-Field Transforms (AMT), a new network archite..."
