From Smooth Wasserstein Distance to Dual Sobolev Norm: Empirical Approximation and Statistical Applications

01/11/2021
by Sloan Nietert, et al.

Statistical distances, i.e., discrepancy measures between probability distributions, are ubiquitous in probability theory, statistics, and machine learning. To combat the curse of dimensionality when estimating these distances from data, recent work has proposed smoothing out local irregularities in the measured distributions via convolution with a Gaussian kernel. Motivated by the scalability of the smooth framework to high dimensions, we conduct an in-depth study of the structural and statistical behavior of the Gaussian-smoothed p-Wasserstein distance 𝖶_p^(σ), for arbitrary p ≥ 1. We start by showing that 𝖶_p^(σ) admits a metric structure that is topologically equivalent to classic 𝖶_p and is stable with respect to perturbations in σ. Moving to statistical questions, we explore the asymptotic properties of 𝖶_p^(σ)(μ̂_n, μ), where μ̂_n is the empirical distribution of n i.i.d. samples from μ. To that end, we prove that 𝖶_p^(σ) is controlled by a pth-order smooth dual Sobolev norm 𝖽_p^(σ). Since 𝖽_p^(σ)(μ̂_n, μ) coincides with the supremum of an empirical process indexed by Gaussian-smoothed Sobolev functions, it lends itself well to analysis via empirical process theory. We derive the limit distribution of √(n) 𝖽_p^(σ)(μ̂_n, μ) in all dimensions d, when μ is sub-Gaussian. Through the aforementioned bound, this implies a parametric empirical convergence rate of n^(-1/2) for 𝖶_p^(σ), contrasting the n^(-1/d) rate for unsmoothed 𝖶_p when d ≥ 3. As applications, we provide asymptotic guarantees for two-sample testing and minimum distance estimation. When p = 2, we further show that 𝖽_2^(σ) can be expressed as a maximum mean discrepancy.
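To make the smoothing construction concrete: 𝖶_p^(σ)(μ, ν) is the classic p-Wasserstein distance between μ ∗ 𝒩(0, σ²I_d) and ν ∗ 𝒩(0, σ²I_d). Below is a minimal Python sketch (not the authors' code) of a plug-in Monte Carlo estimate of 𝖶_p^(σ) between two empirical measures, where the convolution is approximated by adding independent Gaussian noise to each sample and the resulting discrete OT problem is solved exactly with the POT library; the function name smooth_wasserstein and all defaults are illustrative choices, not taken from the paper.

import numpy as np
import ot  # POT: Python Optimal Transport (https://pythonot.github.io/)

def smooth_wasserstein(X, Y, sigma=1.0, p=2, seed=0):
    """Monte Carlo plug-in estimate of W_p^(sigma) between the empirical
    measures of the sample arrays X (n x d) and Y (m x d)."""
    rng = np.random.default_rng(seed)
    # Approximate convolution with N(0, sigma^2 I) by perturbing each sample.
    Xs = X + sigma * rng.standard_normal(X.shape)
    Ys = Y + sigma * rng.standard_normal(Y.shape)
    # Pairwise cost |x - y|^p and uniform weights on the two point clouds.
    M = ot.dist(Xs, Ys, metric="euclidean") ** p
    a = np.full(len(Xs), 1.0 / len(Xs))
    b = np.full(len(Ys), 1.0 / len(Ys))
    # Exact OT cost, then the p-th root gives the distance.
    return ot.emd2(a, b, M) ** (1.0 / p)

# Example: two Gaussian sample clouds in d = 5 dimensions.
rng = np.random.default_rng(1)
X = rng.standard_normal((200, 5))
Y = rng.standard_normal((200, 5)) + 0.5
print(smooth_wasserstein(X, Y, sigma=1.0, p=2))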

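For the p = 2 case, the MMD representation means 𝖽_2^(σ)(μ̂_n, μ) reduces to a closed-form combination of kernel evaluations. The sketch below shows a generic biased (V-statistic) MMD estimator in Python, assuming only NumPy; the Gaussian RBF kernel is a placeholder stand-in, since the specific kernel realizing 𝖽_2^(σ) is derived in the paper and is not reproduced here.

import numpy as np

def rbf_kernel(A, B, bandwidth=1.0):
    # Placeholder kernel: k(x, y) = exp(-|x - y|^2 / (2 * bandwidth^2)).
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq / (2.0 * bandwidth**2))

def mmd(X, Y, kernel=rbf_kernel):
    """Biased (V-statistic) MMD estimate between samples X ~ mu and Y ~ nu:
    sqrt(E k(x,x') + E k(y,y') - 2 E k(x,y)), averaged over all pairs."""
    Kxx = kernel(X, X).mean()
    Kyy = kernel(Y, Y).mean()
    Kxy = kernel(X, Y).mean()
    return np.sqrt(max(Kxx + Kyy - 2.0 * Kxy, 0.0))

Because this estimator needs only O(n²) kernel evaluations and no optimization, the p = 2 representation makes 𝖽_2^(σ) directly usable in practice, e.g., as the test statistic in the two-sample testing application mentioned above.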