Bounds on Wasserstein distances between continuous distributions using independent samples

03/22/2022
by   Tamás Papp, et al.

The plug-in estimator of the Wasserstein distance is known to be conservative, but its usefulness is severely limited when the distributions are similar, as its bias does not decay to zero with the true Wasserstein distance. We propose a linear combination of plug-in estimators for the squared 2-Wasserstein distance with a reduced bias that decays to zero with the true distance. The new estimator is provably conservative provided one distribution is appropriately overdispersed with respect to the other, and is unbiased when the distributions are equal. We apply it to approximately bound from above the 2-Wasserstein distance between the target and current distribution in Markov chain Monte Carlo, running multiple identically distributed chains which start, and remain, overdispersed with respect to the target. Our bound consistently outperforms the current state-of-the-art bound, which uses coupling, improving mixing time bounds by up to an order of magnitude.
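The paper's exact linear combination is not given in the abstract, but the underlying idea can be illustrated in one dimension, where the plug-in squared 2-Wasserstein distance between equal-size samples is the mean squared difference of order statistics. The sketch below (function names and the specific half-splitting scheme are illustrative assumptions, not the authors' estimator) subtracts within-sample plug-in terms from a cross-sample term, so the estimate concentrates near zero when the two distributions are equal:

```python
import numpy as np

def w2_squared_plugin(x, y):
    """Plug-in estimate of the squared 2-Wasserstein distance between two
    equal-size 1-D samples: mean squared difference of order statistics."""
    xs, ys = np.sort(np.asarray(x)), np.sort(np.asarray(y))
    return float(np.mean((xs - ys) ** 2))

def w2_squared_debiased(x, y):
    """A simple bias-reduction sketch (NOT the paper's exact estimator):
    subtract within-sample plug-in terms, computed on split halves, from a
    cross-sample term.  When the distributions are equal, the within and
    cross terms have the same expectation, so the estimate is near zero."""
    x1, x2 = np.array_split(np.asarray(x), 2)
    y1, y2 = np.array_split(np.asarray(y), 2)
    cross = w2_squared_plugin(x1, y1)
    within = 0.5 * (w2_squared_plugin(x1, x2) + w2_squared_plugin(y1, y2))
    return cross - within
```

On two independent samples from the same continuous distribution, `w2_squared_plugin` is strictly positive (the bias the abstract describes), while `w2_squared_debiased` fluctuates around zero; on a pure location shift of size 2 the plug-in estimate recovers the true value 4.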


research
04/28/2023

On the 1-Wasserstein Distance between Location-Scale Distributions and the Effect of Differential Privacy

We provide exact expressions for the 1-Wasserstein distance between i...
research
08/13/2022

Finite Sample Complexity of Sequential Monte Carlo Estimators on Multimodal Target Distributions

We prove finite sample complexities for sequential Monte Carlo (SMC) alg...
research
05/23/2019

Estimating Convergence of Markov chains with L-Lag Couplings

Markov chain Monte Carlo (MCMC) methods generate samples that are asympt...
research
03/09/2019

Orthogonal Estimation of Wasserstein Distances

Wasserstein distances are increasingly used in a wide variety of applica...
research
09/14/2023

A minimum Wasserstein distance approach to Fisher's combination of independent discrete p-values

This paper introduces a comprehensive framework to adjust a discrete tes...
research
06/24/2023

Smoothed f-Divergence Distributionally Robust Optimization: Exponential Rate Efficiency and Complexity-Free Calibration

In data-driven optimization, sample average approximation is known to su...
