Machine Learning Trivializing Maps: A First Step Towards Understanding How Flow-Based Samplers Scale Up

12/31/2021
by Luigi Del Debbio et al.

A trivializing map is a field transformation whose Jacobian determinant exactly cancels the interaction terms in the action, providing a representation of the theory in terms of a deterministic transformation of a distribution from which sampling is trivial. Recently, a proof-of-principle study by Albergo, Kanwar and Shanahan [arXiv:1904.12072] demonstrated that approximations of trivializing maps can be 'machine-learned' by a class of invertible, differentiable neural models called normalizing flows. By ensuring that the Jacobian determinant can be computed efficiently, asymptotically exact sampling from the theory of interest can be performed by drawing samples from a simple distribution and passing them through the network. From a theoretical perspective, this approach has the potential to become more efficient than traditional Markov Chain Monte Carlo sampling techniques, where autocorrelations severely diminish the sampling efficiency as one approaches the continuum limit. A major caveat is that it is not yet understood how the size of models and the cost of training them are expected to scale. As a first step, we have conducted an exploratory scaling study using two-dimensional ϕ^4 theory with up to 20^2 lattice sites. Although the scope of our study is limited to a particular model architecture and training algorithm, initial results paint an interesting picture in which training costs grow very quickly indeed. We describe a candidate explanation for the poor scaling, and outline our intentions to clarify the situation in future work.
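The sampling scheme the abstract describes can be sketched in a few dozen lines: push Gaussian noise through invertible coupling layers while accumulating the log-Jacobian, then restore exactness with an independence Metropolis accept/reject against the ϕ^4 action. The following is a minimal numpy sketch, not the paper's implementation: the lattice size, action parameters (m2, lam), and the fixed coupling parameters (a, b) standing in for a trained neural network are all illustrative assumptions.

```python
import numpy as np

# Minimal sketch of flow-based sampling for 2D phi^4 (illustrative only).
# The affine coupling layers use fixed, untrained parameters (a, b) in
# place of a trained network; L, m2, lam are arbitrary example values.

rng = np.random.default_rng(0)
L = 4  # tiny lattice for illustration

def phi4_action(phi, m2=-4.0, lam=8.0):
    """Euclidean lattice action: kinetic term plus mass and quartic terms."""
    s = 0.0
    for mu in (0, 1):
        s += 0.5 * np.sum((np.roll(phi, -1, axis=mu) - phi) ** 2)
    return s + np.sum(0.5 * m2 * phi**2 + 0.25 * lam * phi**4)

mask = np.indices((L, L)).sum(axis=0) % 2  # checkerboard partition

def coupling_layer(phi, a, b, mask):
    """Affine coupling: rescale/shift 'active' sites conditioned on the
    frozen half, so the Jacobian is triangular with a tractable log-det."""
    frozen = (1 - mask) * phi
    neigh = sum(np.roll(frozen, sh, axis=ax) for sh in (-1, 1) for ax in (0, 1))
    s = a * np.tanh(neigh)      # per-site log-scale (stand-in for a network)
    t = b * neigh               # per-site shift
    out = (1 - mask) * phi + mask * (phi * np.exp(s) + t)
    return out, np.sum(mask * s)  # log|det J| = sum of active log-scales

def sample_flow(n_layers=4, a=0.1, b=0.1):
    """Draw z ~ N(0, 1), push it through the flow, track log q(phi)."""
    phi = rng.standard_normal((L, L))
    logq = -0.5 * np.sum(phi**2) - 0.5 * L * L * np.log(2 * np.pi)
    m = mask
    for _ in range(n_layers):
        phi, logdet = coupling_layer(phi, a, b, m)
        logq -= logdet          # change of variables: log q(phi) = log q(z) - log|det J|
        m = 1 - m               # alternate checkerboard halves
    return phi, logq

# Independence Metropolis restores exactness: accept a proposal phi' with
# probability min(1, [q(phi) e^{-S(phi')}] / [q(phi') e^{-S(phi)}]).
phi, logq = sample_flow()
accepted = 0
for _ in range(200):
    phi_new, logq_new = sample_flow()
    dlog = (logq - phi4_action(phi_new)) - (logq_new - phi4_action(phi))
    if np.log(rng.uniform()) < dlog:
        phi, logq = phi_new, logq_new
        accepted += 1
```

With an untrained flow the acceptance rate is poor; the point of training is to bring q close to e^{-S}/Z so that proposals decorrelate the chain in a few steps, which is exactly where the scaling of training cost studied in the paper becomes the bottleneck.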

Related research:

Flow-based generative models for Markov chain Monte Carlo in lattice field theory (04/26/2019): A Markov chain update scheme using a machine-learned flow-based generati...

Aspects of scaling and scalability for flow-based sampling of lattice QCD (11/14/2022): Recent applications of machine-learned normalizing flows to sampling in ...

On Sampling with Approximate Transport Maps (02/09/2023): Transport maps can ease the sampling of distributions with non-trivial g...

Gradient estimators for normalising flows (02/02/2022): Recently a machine learning approach to Monte-Carlo simulations called N...

Machine Learning and Variational Algorithms for Lattice Field Theory (06/03/2021): In lattice quantum field theory studies, parameters defining the lattice...

Normalizing flow sampling with Langevin dynamics in the latent space (05/20/2023): Normalizing flows (NF) use a continuous generator to map a simple latent...

Generative models for scalar field theories: how to deal with poor scaling? (01/04/2023): Generative models, such as the method of normalizing flows, have been su...
