Learning Optimal Flows for Non-Equilibrium Importance Sampling

06/20/2022
by Yu Cao, et al.

Many applications in the computational sciences and statistical inference require computing expectations with respect to complex, high-dimensional distributions with unknown normalization constants, as well as estimating these constants. Here we develop a method for these calculations based on generating samples from a simple base distribution, transporting them along the flow generated by a velocity field, and performing averages along these flowlines. This non-equilibrium importance sampling (NEIS) strategy is straightforward to implement and can be used for calculations with arbitrary target distributions. On the theory side, we discuss how to tailor the velocity field to the target and establish general conditions under which the proposed estimator is a perfect estimator with zero variance. We also draw connections between NEIS and approaches based on mapping a base distribution onto a target via a transport map. On the computational side, we show how to use deep learning to represent the velocity field by a neural network and train it toward the zero-variance optimum. These results are illustrated numerically on high-dimensional examples, where we show that training the velocity field can decrease the variance of the NEIS estimator by up to six orders of magnitude compared to a vanilla estimator. We also show that on these examples NEIS performs better than Neal's annealed importance sampling (AIS).
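As a rough illustration of the workflow the abstract describes (sample a base distribution, transport the samples along the flow of a velocity field, and average along the resulting flowlines), the sketch below estimates a normalization constant in a toy setting. It is a minimal, hypothetical sketch, not the paper's implementation: the two-mode Gaussian target, the standard Gaussian base, the hand-picked linear velocity field, the finite integration window, and the forward-Euler discretization are all assumptions made here; the paper derives the exact along-flowline estimator and trains the velocity field as a neural network toward the zero-variance optimum.

```python
# Hypothetical sketch of the NEIS workflow described in the abstract: draw points
# from a simple base distribution, transport them along the flow of a velocity
# field, and average the unnormalized target along each flowline.  All concrete
# choices below are illustrative assumptions, not the paper's construction.
import numpy as np

rng = np.random.default_rng(0)
d = 2                # dimension
T, dt = 2.0, 0.01    # half-length of each flowline and Euler step (assumed)
n_steps = int(T / dt)

def log_q(x):
    """Unnormalized log target: equal mixture of two Gaussians centered at +/-2."""
    return np.logaddexp(-0.5 * np.sum((x - 2.0) ** 2),
                        -0.5 * np.sum((x + 2.0) ** 2))

def log_rho_b(x):
    """Log density of the standard Gaussian base distribution."""
    return -0.5 * np.sum(x ** 2) - 0.5 * d * np.log(2.0 * np.pi)

def velocity(x):
    """Hand-picked linear velocity field; in the paper this is a trained network."""
    return 0.5 * x

def divergence(x):
    """Divergence of the velocity field above (0.5 per coordinate)."""
    return 0.5 * d

def flowline_estimate(x0):
    """Ratio of along-flowline integrals of q and rho_b, each weighted by the
    Jacobian factor exp(int_0^t div b(X_s) ds); one sample's contribution to the
    estimate of the normalization constant Z."""
    num = den = 0.0
    for sign in (+1.0, -1.0):               # transport forward and backward in time
        x, log_jac = x0.copy(), 0.0
        for step in range(n_steps):
            if sign > 0 or step > 0:        # avoid counting t = 0 twice
                num += np.exp(log_q(x) + log_jac) * dt
                den += np.exp(log_rho_b(x) + log_jac) * dt
            v, div = velocity(x), divergence(x)
            x = x + sign * dt * v           # Euler step of dX/dt = b(X)
            log_jac += sign * dt * div      # d(log J)/dt = div b(X)
    return num / den

# Monte Carlo average over base samples; the finite window and Euler step make
# this a sketch only, with discretization and truncation error.
samples = [flowline_estimate(rng.standard_normal(d)) for _ in range(2000)]
print("Z estimate:", np.mean(samples), "+/-", np.std(samples) / np.sqrt(len(samples)))
```

With a zero velocity field (and zero divergence) the per-sample ratio collapses to q(x0)/rho_b(x0), i.e. the weight of the vanilla importance-sampling estimator that the abstract uses as its variance baseline; a well-chosen velocity field instead spreads each flowline across the regions where the target carries mass.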

