
Discrepancy Bounds for a Class of Negatively Dependent Random Points Including Latin Hypercube Samples

by Michael Gnewuch, et al.

We introduce a class of γ-negatively dependent random samples. We prove that this class includes, apart from Monte Carlo samples, in particular Latin hypercube samples and Latin hypercube samples padded by Monte Carlo. For a γ-negatively dependent N-point sample in dimension d we provide probabilistic upper bounds for its star discrepancy with explicitly stated dependence on N, d, and γ. These bounds generalize the probabilistic bounds for Monte Carlo samples from [Heinrich et al., Acta Arith. 96 (2001), 279–302] and [C. Aistleitner, J. Complexity 27 (2011), 531–540], and they are optimal for Monte Carlo and Latin hypercube samples. In the special case of Monte Carlo samples the constants that appear in our bounds improve substantially on the constants presented in the latter paper and in [C. Aistleitner, M. T. Hofer, Math. Comp. 83 (2014), 1373–1381].
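As a rough illustration of the sampling scheme the abstract centers on, a Latin hypercube sample in $[0,1)^d$ places exactly one point in each of the $N$ equal strata along every coordinate axis, using an independent random permutation per dimension. The sketch below is not from the paper; the function name and parameters are illustrative, and it uses only the Python standard library:

```python
import random

def latin_hypercube(n, d, seed=0):
    """Generate an n-point Latin hypercube sample in [0,1)^d.

    Each axis is split into n equal strata; every stratum on every
    axis receives exactly one point. The strata are assigned via an
    independent random permutation per dimension, and each point is
    jittered uniformly within its cell.
    """
    rng = random.Random(seed)
    # one independent permutation of the strata per dimension
    perms = [rng.sample(range(n), n) for _ in range(d)]
    return [
        tuple((perms[j][i] + rng.random()) / n for j in range(d))
        for i in range(n)
    ]

pts = latin_hypercube(8, 2)
# Latin property: projecting onto any axis hits each stratum exactly once
for j in range(2):
    assert sorted(int(p[j] * 8) for p in pts) == list(range(8))
```

The per-axis stratification is what induces the negative dependence exploited in the paper's discrepancy bounds: two distinct points can never share a stratum on any coordinate, so their coordinates are negatively associated compared to plain Monte Carlo sampling.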



