Why Simple Quadrature is just as good as Monte Carlo

08/02/2019
by Kevin Vanslette et al.

We motivate and calculate the variance of Newton-Cotes quadrature integration and compare it directly with the variance of Monte Carlo (MC) integration. We find an equivalence between deterministic quadrature sampling and random MC sampling by noting that random function sampling is statistically indistinguishable from deterministic sampling of a randomly shuffled (permuted) function. We use this statistical equivalence to regularize the Bayesian priors one can place on quadrature integration so that they are informationally consistent with the theoretical (frequentist) MC probabilities, even when neither the integrand nor the sampling is randomized in any way. This leads to a proof that composite rectangular and midpoint quadrature integrations have expected variances less than or equal to their corresponding theoretical MC integration variances. Separately, using Bayesian probability theory, we find that the theoretical standard deviations of the unbiased errors of some composite Newton-Cotes quadrature integrations improve over their worst-case errors by an extra dimension-independent factor ∝ N^{-1/2}, where N is the number of samples taken. This sharpens the Newton-Cotes error analysis and shows which quadrature methods can be implemented reliably in higher dimensions. These results are validated in our simulations.
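As a concrete illustration of the comparison made in the abstract, here is a minimal numerical sketch (not taken from the paper; the integrand e^x on [0, 1], the sample sizes, and the helper names midpoint and mc are illustrative assumptions). It contrasts the deterministic error of the composite midpoint rule with the RMS error of plain MC, whose variance sigma_f^2 / N gives the ∝ N^{-1/2} scaling mentioned above, and it checks the shuffle argument: deterministic stride sampling of a randomly permuted table of function values behaves statistically like random sampling.

import numpy as np

rng = np.random.default_rng(0)

def f(x):
    return np.exp(x)                    # smooth test integrand on [0, 1]

EXACT = np.e - 1.0                      # exact value of the integral of e^x over [0, 1]

def midpoint(n):
    """Composite midpoint rule with n equal panels on [0, 1]."""
    h = 1.0 / n
    return h * f((np.arange(n) + 0.5) * h).sum()

def mc(n):
    """Plain Monte Carlo estimate from n i.i.d. uniform samples."""
    return f(rng.uniform(0.0, 1.0, n)).mean()

# 1) Deterministic quadrature error vs. the RMS error of MC at the same N.
for n in (10, 100, 1000):
    quad_err = abs(midpoint(n) - EXACT)
    mc_rms = np.sqrt(np.mean([(mc(n) - EXACT) ** 2 for _ in range(2000)]))
    print(f"N={n:4d}  midpoint |error| = {quad_err:.2e}   MC RMS error = {mc_rms:.2e}")

# 2) Shuffle equivalence: deterministic stride sampling of a randomly
#    permuted table of function values behaves like random sampling.
grid = f((np.arange(10_000) + 0.5) / 10_000)   # fine table of f values
n = 100
est = [rng.permutation(grid)[:: len(grid) // n].mean() for _ in range(2000)]
print("std of stride sampling on shuffled f:", np.std(est))
print("theoretical MC std, sigma_f/sqrt(N) :", grid.std() / np.sqrt(n))

On this smooth integrand the midpoint error decays like N^{-2}, several orders of magnitude below the MC RMS error at the same N, while the spread of the shuffled deterministic estimator matches the theoretical MC value sigma_f / sqrt(N) up to the small without-replacement correction.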


Related research

08/24/2011 · Using Supervised Learning to Improve Monte Carlo Integral Estimation
Monte Carlo (MC) techniques are often used to estimate integrals of a mu...

04/07/2021 · Quasi-Newton Quasi-Monte Carlo for variational Bayes
Many machine learning problems optimize an objective that must be measur...

07/18/2021 · Compressed Monte Carlo with application in particle filtering
Bayesian models have become very popular over the last years in several ...

05/20/2022 · Posterior Refinement Improves Sample Efficiency in Bayesian Neural Networks
Monte Carlo (MC) integration is the de facto method for approximating th...

08/18/2020 · On dropping the first Sobol' point
Quasi-Monte Carlo (QMC) points are a substitute for plain Monte Carlo (M...

02/19/2022 · Graph Reparameterizations for Enabling 1000+ Monte Carlo Iterations in Bayesian Deep Neural Networks
Uncertainty estimation in deep models is essential in many real-world ap...

02/24/2022 · A Timing Yield Model for SRAM Cells in Sub/Near-threshold Voltages Based on A Compact Drain Current Model
Sub/Near-threshold static random-access memory (SRAM) design is crucial ...
