Jittering Samples using a kd-Tree Stratification

02/17/2020
by Alexandros D. Keros, et al.

Monte Carlo sampling techniques are used to estimate high-dimensional integrals that model the physics of light transport in virtual scenes for computer graphics applications. These methods rely on the law of large numbers to estimate expectations via simulation, typically resulting in slow convergence. Their errors usually manifest as undesirable grain in the pictures generated by image synthesis algorithms. It is well known that these errors diminish when the samples are chosen appropriately. A well-known technique for reducing error operates by subdividing the integration domain, estimating the integral in each stratum, and aggregating these values into a stratified sampling estimate. Naïve stratification methods based on a lattice (grid) are known to improve the convergence rate of Monte Carlo, but require a number of samples that grows exponentially with the dimensionality of the domain. We propose a simple stratification scheme for d-dimensional hypercubes using the kd-tree data structure. Our scheme enables the generation of an arbitrary number of equal-volume partitions of the rectangular domain, and n samples can be generated in O(n) time. Since we do not always need to explicitly build a kd-tree, we provide a simple procedure that allows the sample set to be drawn fully in parallel without any precomputation or storage, speeding up sampling to O(log n) time per sample when executed on n cores. If the tree is implicitly precomputed (O(n) storage), the parallelised run time reduces to O(1) on n cores. In addition to these benefits, we provide an upper bound on the worst-case star discrepancy for n samples that matches that of lattice-based sampling strategies, which arise as a special case of our proposed method. We use a number of quantitative and qualitative tests to compare our method against state-of-the-art samplers for image synthesis.
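To make the idea concrete, here is a minimal sketch (not the authors' implementation) of the kind of kd-tree stratification the abstract describes: the unit hypercube is split recursively along cycling axes, with the split plane placed so that each side's volume is proportional to the number of leaves it must hold, yielding an arbitrary number n of equal-volume cells; one sample is then jittered uniformly inside each cell. The function name and recursion scheme are illustrative assumptions.

```python
import random

def kd_stratified_samples(n, d, lo=None, hi=None, depth=0, rng=random):
    """Split [0,1)^d into n equal-volume axis-aligned cells via
    kd-tree-style cuts (cycling axes) and jitter one uniform
    sample inside each cell. Returns a list of n points."""
    if lo is None:
        lo, hi = [0.0] * d, [1.0] * d
    if n == 1:
        # Leaf: draw one uniform sample inside this cell.
        return [[rng.uniform(lo[k], hi[k]) for k in range(d)]]
    axis = depth % d          # cycle through the d dimensions
    n_left = n // 2           # divide the leaf budget as evenly as possible
    n_right = n - n_left
    # Place the cut so each child's volume is proportional to its leaf
    # count; by induction every leaf ends up with volume 1/n.
    cut = lo[axis] + (hi[axis] - lo[axis]) * n_left / n
    left_hi = hi[:];  left_hi[axis] = cut
    right_lo = lo[:]; right_lo[axis] = cut
    return (kd_stratified_samples(n_left, d, lo, left_hi, depth + 1, rng)
            + kd_stratified_samples(n_right, d, right_lo, hi, depth + 1, rng))

# 7 jittered samples in the unit square; note n need not be a perfect grid size.
samples = kd_stratified_samples(7, 2)
```

Because the tree shape is determined entirely by (n, depth), each leaf's cell can also be computed independently by descending from the root, which is what enables the parallel, storage-free variant mentioned in the abstract.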


