Neural Network Renormalization Group

02/08/2018
by Shuo-Hui Li, et al.

We present a variational renormalization group approach based on a deep generative model composed of bijectors. The model learns hierarchical transformations between physical variables and renormalized collective variables, and it can directly generate statistically independent physical configurations by iterative refinement at various length scales. Because the generative model has an exact and tractable likelihood, it provides the renormalized energy function of the collective variables and supports unbiased rejection sampling of the physical variables. To train the neural network, we employ probability density distillation, in which the training loss is a variational upper bound on the physical free energy. The approach could be useful for automatically identifying collective variables and effective field theories.
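The abstract names three ingredients that fit together in a few dozen lines: bijectors with an exact log-likelihood, a probability-density-distillation loss that upper-bounds the free energy, and likelihood-based rejection sampling. The following is a minimal PyTorch sketch of those ingredients only. The affine coupling layers, the `toy_energy` target, and names such as `variational_free_energy` and `log_M` are illustrative assumptions; the paper's specific hierarchical arrangement of bijectors is not reproduced here.

```python
import torch
import torch.nn as nn

class AffineCoupling(nn.Module):
    """One bijector: a RealNVP-style affine coupling layer with an
    exact, tractable log-Jacobian determinant."""
    def __init__(self, dim, hidden=64, flip=False):
        super().__init__()
        self.flip = flip
        half = dim // 2
        self.net = nn.Sequential(
            nn.Linear(half, hidden), nn.Tanh(),
            nn.Linear(hidden, 2 * (dim - half)),
        )

    def forward(self, z):
        d = z.shape[1] // 2
        # Transform one half of the variables conditioned on the other half.
        a, b = (z[:, d:], z[:, :d]) if self.flip else (z[:, :d], z[:, d:])
        s, t = self.net(a).chunk(2, dim=1)
        s = torch.tanh(s)                     # bounded scale for stability
        b = b * torch.exp(s) + t
        x = torch.cat([b, a] if self.flip else [a, b], dim=1)
        return x, s.sum(dim=1)                # output and log|det Jacobian|

class Flow(nn.Module):
    """Stack of bijectors mapping collective variables z ~ N(0, I) to
    physical variables x, with an exact log-likelihood log q(x)."""
    def __init__(self, dim, n_layers=6):
        super().__init__()
        self.dim = dim
        self.layers = nn.ModuleList(
            AffineCoupling(dim, flip=(i % 2 == 1)) for i in range(n_layers)
        )
        self.prior = torch.distributions.Normal(0.0, 1.0)

    def sample(self, n):
        z = self.prior.sample((n, self.dim))
        log_q = self.prior.log_prob(z).sum(dim=1)
        for layer in self.layers:
            z, logdet = layer(z)
            log_q = log_q - logdet            # change-of-variables formula
        return z, log_q                       # x and log q(x)

def variational_free_energy(flow, energy_fn, batch=256):
    """Probability density distillation loss E_q[log q(x) + E(x)],
    a variational upper bound on the free energy (beta = 1)."""
    x, log_q = flow.sample(batch)
    return (log_q + energy_fn(x)).mean()

def rejection_sample(flow, energy_fn, log_M, n=1024):
    """Unbiased rejection sampling enabled by the tractable likelihood:
    accept x ~ q with probability exp(-E(x)) / (M * q(x)). The constant
    log_M is assumed to upper-bound -E(x) - log q(x)."""
    x, log_q = flow.sample(n)
    log_accept = -energy_fn(x) - log_q - log_M
    keep = torch.log(torch.rand(n)) < log_accept
    return x[keep]

# Hypothetical stand-in for a physical model's energy function.
def toy_energy(x):
    return 0.5 * (x ** 2).sum(dim=1)

flow = Flow(dim=8)
opt = torch.optim.Adam(flow.parameters(), lr=1e-3)
for step in range(500):
    loss = variational_free_energy(flow, toy_energy)
    opt.zero_grad()
    loss.backward()
    opt.step()
```

Minimizing the loss pulls q(x) toward the Boltzmann distribution exp(-E(x))/Z, and since the same forward pass yields both samples and their exact log q, those samples can be rejection-filtered (or reweighted) without bias, which is the property the abstract highlights.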


Related research

09/30/2019  Neural Canonical Transformation with Symplectic Flows
Canonical transformation plays a fundamental role in simplifying and sol...

10/29/2019  Asymptotically Unbiased Generative Neural Sampling
We propose a general framework for the estimation of observables with ge...

09/27/2018  Solving Statistical Mechanics using Variational Autoregressive Networks
We propose a general framework for solving statistical mechanics of syst...

07/11/2018  VFunc: a Deep Generative Model for Functions
We introduce a deep generative model for functions. Our model provides a...

12/07/2020  Multitask machine learning of collective variables for enhanced sampling of rare events
Computing accurate reaction rates is a central challenge in computationa...

04/22/2021  Chasing Collective Variables using Autoencoders and biased trajectories
In the last decades, free energy biasing methods have proven to be power...
