Why Simple Quadrature is just as good as Monte Carlo

by Kevin Vanslette, et al.
Center for Complex Engineering Systems

We motivate and calculate Newton-Cotes quadrature integration variance and compare it directly with Monte Carlo (MC) integration variance. We find an equivalence between deterministic quadrature sampling and random MC sampling by noting that random function sampling is statistically indistinguishable from a method that applies deterministic sampling to a randomly shuffled (permuted) function. We use this statistical equivalence to regularize the possible Bayesian priors one can place on quadrature integration so that they are informationally consistent with theoretical (frequentist) MC probabilities, even if neither the integrand nor the sampling is randomized in any way. This leads to a proof that composite rectangular and midpoint quadrature integrations have expected variances that are less than or equal to their corresponding theoretical MC integration variances. Separately, using Bayesian probability theory, we find that the theoretical standard deviations of the unbiased errors of some Newton-Cotes composite quadrature integrations improve over their worst-case errors by an extra dimension-independent factor ∝ N^{-1/2}, where N is the number of samples taken. This tightens the Newton-Cotes error analysis and shows which quadrature methods can be implemented reliably in higher dimensions. These results are validated in our simulations.
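The comparison the abstract draws can be illustrated numerically. The sketch below (not the authors' code; the test function f(x) = e^x and all parameter values are illustrative assumptions) computes a composite midpoint quadrature estimate of ∫₀¹ eˣ dx alongside plain MC estimates at the same sample budget N, and empirically estimates the standard deviation of the MC error. The deterministic midpoint error scales like N⁻², while the MC error standard deviation scales like N^{-1/2}, consistent with the claim that simple quadrature is at least as good as MC in this setting.

```python
import math
import random

def f(x):
    # Illustrative smooth integrand; true integral over [0, 1] is e - 1.
    return math.exp(x)

TRUE_VALUE = math.e - 1.0
N = 1000  # sample budget shared by both methods

# Composite midpoint quadrature: deterministic sampling on a uniform grid.
h = 1.0 / N
midpoint_estimate = h * sum(f((i + 0.5) * h) for i in range(N))
midpoint_error = abs(midpoint_estimate - TRUE_VALUE)

# Plain Monte Carlo: average of f at N uniform random points, repeated
# over many trials to estimate the standard deviation of the MC error.
random.seed(0)
trials = 200
mc_errors = []
for _ in range(trials):
    estimate = sum(f(random.random()) for _ in range(N)) / N
    mc_errors.append(estimate - TRUE_VALUE)
mc_error_std = (sum(e * e for e in mc_errors) / trials) ** 0.5

print(midpoint_error)  # O(N^-2): tiny deterministic error
print(mc_error_std)    # O(N^-1/2): much larger stochastic error
```

For N = 1000 the midpoint error is on the order of 10⁻⁷ while the MC error standard deviation is on the order of 10⁻², a gap of several orders of magnitude at the same number of function evaluations.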


