
Notes on optimal approximations for importance sampling

by Jacopo Pantaleoni, et al.

In this manuscript, we derive optimality conditions for building function approximations that minimize variance when used as importance sampling densities for Monte Carlo integration problems. In particular, we study the problem of finding the optimal projection g of an integrand f onto certain classes of piecewise constant functions, so as to minimize the variance of the unbiased importance sampling estimator E_g[f/g], as well as the related problem of finding optimal mixture weights to approximate and importance-sample a target mixture distribution f = ∑_i α_i f_i, with components f_i in a family F, through a corresponding mixture of importance sampling densities g_i that are only approximately proportional to the f_i. We further show that in both cases the optimal projection differs from the commonly used ℓ_1 projection, and provide an intuitive explanation for the difference.
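The first of these results can be illustrated numerically. For a piecewise constant g on a fixed partition of equal-width cells, a standard Lagrange-multiplier derivation (minimize ∑_k ∫_{C_k} f²/c_k subject to ∑_k c_k |C_k| = 1) gives optimal cell values proportional to the per-cell root mean square of f, whereas the ℓ_1 projection uses the per-cell mean. The sketch below compares the two on a hypothetical 1D integrand with a narrow peak; the integrand, cell count, and sample size are illustrative assumptions, not the paper's experiments:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 1D integrand on [0, 1] with a narrow peak (illustrative only).
def f(x):
    return 1.0 + 50.0 * np.exp(-((x - 0.3) / 0.01) ** 2)

K = 8                                  # equal-width cells on [0, 1]
edges = np.linspace(0.0, 1.0, K + 1)
width = 1.0 / K

# Dense midpoint grid for per-cell quadrature and the reference integral.
m = 4000                               # quadrature points per cell
grid = (np.arange(K * m) + 0.5) / (K * m)
vals = f(grid).reshape(K, m)
true_I = vals.mean()                   # midpoint-rule estimate of ∫ f

cell_mean = vals.mean(axis=1)                 # ℓ1 projection: per-cell mean
cell_rms = np.sqrt((vals ** 2).mean(axis=1))  # optimal: per-cell RMS

def importance_sample(weights, n=400_000):
    """Sample the piecewise-constant density built from per-cell weights
    and return (estimate of the integral, empirical variance of f/g)."""
    p = weights / weights.sum()        # cell selection probabilities
    dens = p / width                   # density value inside each cell
    cells = rng.choice(K, size=n, p=p)
    x = edges[cells] + width * rng.random(n)
    r = f(x) / dens[cells]             # single-sample estimator f/g
    return r.mean(), r.var()

est_l1, var_l1 = importance_sample(cell_mean)
est_opt, var_opt = importance_sample(cell_rms)
# Both estimators are unbiased; the RMS projection yields lower variance
# because it allocates more samples to the cell containing the peak.
```

The gap between the two variances grows with the within-cell variation of f: when f is nearly constant inside every cell, mean and RMS coincide and the two projections agree, which matches the intuition that the ℓ_1 projection is only suboptimal where the approximation is coarse.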



