Asymptotically Unbiased Generative Neural Sampling

10/29/2019
by Kim A. Nicoli et al.

We propose a general framework for the estimation of observables with generative neural samplers, focusing on modern deep generative neural networks that provide an exact sampling probability. Within this framework, we present asymptotically unbiased estimators for generic observables, including those that explicitly depend on the partition function, such as the free energy or entropy, and derive the corresponding variance estimators. We demonstrate their practical applicability in numerical experiments on the 2D Ising model, which highlight their superiority over existing methods. Our approach greatly enhances the applicability of generative neural samplers to real-world physical systems.
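The essential mechanism behind such estimators, reweighting samples from a generative model with a known exact sampling probability q(s) by their Boltzmann weights, can be sketched on a toy 2x2 Ising model. The following is a minimal illustration, not the authors' code: the uniform sampler stands in for a trained neural sampler, and all names and parameters here are assumptions for the example.

```python
import itertools
import math
import random

def energy(s):
    # 2x2 open-boundary Ising model, J = 1: two horizontal and two vertical bonds
    return -(s[0] * s[1] + s[2] * s[3] + s[0] * s[2] + s[1] * s[3])

beta = 0.5

# Exact reference values by brute-force enumeration of all 2^4 configurations
states = list(itertools.product([-1, 1], repeat=4))
Z_exact = sum(math.exp(-beta * energy(s)) for s in states)
E_exact = sum(energy(s) * math.exp(-beta * energy(s)) for s in states) / Z_exact

# Stand-in for a trained generative neural sampler: a uniform distribution
# whose exact sampling probability q(s) = 1/16 is known, as the framework requires.
random.seed(0)
N = 100_000
q = 1.0 / 16.0
samples = [tuple(random.choice((-1, 1)) for _ in range(4)) for _ in range(N)]

# Importance weights w(s) = exp(-beta * H(s)) / q(s)
w = [math.exp(-beta * energy(s)) / q for s in samples]

Z_hat = sum(w) / N                # unbiased estimate of the partition function Z
F_hat = -math.log(Z_hat) / beta   # asymptotically unbiased free energy estimate
# Self-normalized estimate of a generic observable, here the mean energy
E_hat = sum(wi * energy(s) for wi, s in zip(w, samples)) / sum(w)

print(f"Z: {Z_hat:.3f} (exact {Z_exact:.3f})")
print(f"E: {E_hat:.3f} (exact {E_exact:.3f})")
```

Because q is exact and has full support, the weighted averages converge to the true thermodynamic values as N grows, which is the asymptotic-unbiasedness property the abstract refers to; partition-function-dependent quantities such as F follow from Z_hat directly.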


Related research

Solving Statistical Mechanics using Variational Autoregressive Networks (09/27/2018)
We propose a general framework for solving statistical mechanics of syst...

Neural Network Renormalization Group (02/08/2018)
We present a variational renormalization group approach using deep gener...

Temporal Network Sampling (10/18/2019)
Temporal networks representing a stream of timestamped edges are seeming...

Unbiased estimation of log normalizing constants with applications to Bayesian cross-validation (10/02/2018)
Posterior distributions often feature intractable normalizing constants,...

A General Framework for Debiasing in CTR Prediction (12/06/2021)
Most of the existing methods for debiasing in click-through rate (CTR) p...

Probability-Based Estimation (04/12/2023)
We develop a theory of estimation when in addition to a sample of n obse...

On the utility of Metropolis-Hastings with asymmetric acceptance ratio (03/26/2018)
The Metropolis-Hastings algorithm allows one to sample asymptotically fr...
