Sharp Convergence Rates for Empirical Optimal Transport with Smooth Costs

06/24/2021
by   Tudor Manole, et al.
We revisit the question of characterizing the convergence rate of plug-in estimators of optimal transport costs. It is well known that an empirical measure comprising independent samples from an absolutely continuous distribution on ℝ^d converges to that distribution at the rate n^-1/d in Wasserstein distance, which can be used to prove that plug-in estimators of many optimal transport costs converge at this same rate. However, we show that when the cost is smooth, this analysis is loose: plug-in estimators based on empirical measures converge quadratically faster, at the rate n^-2/d. As a corollary, we show that the Wasserstein distance between two distributions is significantly easier to estimate when the measures are far apart. We also prove lower bounds, showing not only that our analysis of the plug-in estimator is tight, but also that no other estimator can enjoy significantly faster rates of convergence uniformly over all pairs of measures. Our proofs rely on empirical process theory arguments based on tight control of L^2 covering numbers for locally Lipschitz and semi-concave functions. As a byproduct of our proofs, we derive L^∞ estimates on the displacement induced by the optimal coupling between any two measures satisfying suitable moment conditions, for a wide range of cost functions.
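The plug-in principle the abstract refers to is simple: estimate an optimal transport cost between two distributions by computing the exact OT cost between their empirical measures. As a minimal illustration (not the paper's general smooth-cost setting in ℝ^d), the sketch below uses the one-dimensional case, where the OT cost between two equal-size empirical measures has a closed form via sorted order statistics; the helper name `w1_empirical` is ours, not the paper's.

```python
import numpy as np

def w1_empirical(x, y):
    """Plug-in estimate of the 1-D Wasserstein-1 distance between two
    equal-size samples. In one dimension the optimal coupling matches
    order statistics, so the cost is the mean absolute difference of
    the sorted samples."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    assert x.shape == y.shape
    return float(np.mean(np.abs(np.sort(x) - np.sort(y))))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # The estimate shrinks as n grows when both samples come from the
    # same distribution (here, Uniform[0, 1]).
    for n in (100, 10_000):
        x, y = rng.uniform(size=n), rng.uniform(size=n)
        print(n, w1_empirical(x, y))
```

In higher dimensions the same plug-in recipe applies, but the empirical OT cost must be computed by solving a discrete transport problem, and the paper's result concerns how fast that estimate converges (n^-2/d for smooth costs rather than the naive n^-1/d).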
