Bounding distributional errors via density ratios

05/08/2019
by Lutz Duembgen, et al.

We present some new and explicit error bounds for the approximation of distributions. The approximation error is quantified by the maximal density ratio of the distribution Q to be approximated over its proxy P. This non-symmetric measure is more informative than the total variation distance and implies bounds for it. Explicit approximation problems include, among others, approximating hypergeometric by binomial distributions and (generalized) binomial by Poisson distributions. In many cases we provide both upper and (matching) lower bounds.
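As a concrete illustration of the quantity in question, the following sketch numerically evaluates the maximal density ratio max_k Q(k)/P(k) for one of the examples mentioned in the abstract: a binomial distribution Q approximated by a Poisson proxy P with matching mean. The function names and the specific parameters (n = 100, p = 0.02) are illustrative choices, not taken from the paper.

```python
# Illustrative sketch: maximal density ratio max_k Q(k)/P(k) for
# Q = Binomial(n, p) approximated by the proxy P = Poisson(n*p).
# A ratio close to 1 indicates an accurate approximation.
from math import comb, exp, factorial

def binomial_pmf(k, n, p):
    # Probability mass of Binomial(n, p) at k.
    return comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(k, lam):
    # Probability mass of Poisson(lam) at k.
    return exp(-lam) * lam**k / factorial(k)

def max_density_ratio(n, p):
    """Maximal ratio Q(k)/P(k) over the support {0, ..., n} of Q."""
    lam = n * p  # match the mean of the binomial
    return max(binomial_pmf(k, n, p) / poisson_pmf(k, lam)
               for k in range(n + 1))

ratio = max_density_ratio(n=100, p=0.02)
```

Since Q sums to 1 over {0, ..., n} while the Poisson proxy places some mass outside that set, the maximal ratio always exceeds 1; for small p it is only slightly above 1, reflecting the quality of the Poisson approximation.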

