
Notes on optimal approximations for importance sampling

07/26/2017
by Jacopo Pantaleoni, et al.

In this manuscript, we derive optimality conditions for building function approximations that minimize variance when used as importance sampling densities for Monte Carlo integration. In particular, we study the problem of finding the optimal projection g of an integrand f onto certain classes of piecewise constant functions, in order to minimize the variance of the unbiased importance sampling estimator E_g[f/g], as well as the related problem of finding optimal mixture weights to approximate and importance sample a target mixture distribution f = ∑_i α_i f_i with components f_i in a family F, through a corresponding mixture of importance sampling densities g_i that are only approximately proportional to f_i. We further show that in both cases the optimal projection differs from the commonly used ℓ_1 projection, and we provide an intuitive explanation for the difference.
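To make the setting concrete, here is a minimal sketch of the unbiased importance sampling estimator E_g[f/g] with a piecewise-constant density g on [0,1]. The bin heights below are simply taken proportional to f at bin centers (an ℓ_1-style projection used purely for illustration); the paper's point is that this common choice is not the variance-optimal projection. The integrand f(x) = x² and the bin count are arbitrary choices, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def f(x):
    return x ** 2  # example integrand; its integral over [0,1] is 1/3

# Piecewise-constant density g on [0,1] with K equal-width bins,
# with heights proportional to f at the bin centers (an l1-style
# projection, used here only as a simple stand-in for g).
K = 16
edges = np.linspace(0.0, 1.0, K + 1)
centers = 0.5 * (edges[:-1] + edges[1:])
heights = f(centers)
probs = heights / heights.sum()   # probability mass of each bin
density = probs * K               # pdf value inside each bin (bin width 1/K)

def sample_g(n):
    """Draw n samples from g: pick a bin, then a uniform point in it."""
    bins = rng.choice(K, size=n, p=probs)
    return edges[bins] + rng.random(n) / K

def g_pdf(x):
    """Evaluate the piecewise-constant pdf g at points x in [0,1)."""
    idx = np.minimum((x * K).astype(int), K - 1)
    return density[idx]

N = 100_000
x = sample_g(N)
estimate = np.mean(f(x) / g_pdf(x))  # unbiased estimator E_g[f/g]
print(estimate)  # close to the true integral 1/3
```

The estimator stays unbiased for any strictly positive g; only its variance depends on how g is built, which is exactly the quantity the optimal projection minimizes.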

