A Class of Dimensionality-free Metrics for the Convergence of Empirical Measures

04/24/2021
by Jiequn Han, et al.

This paper concerns the convergence of empirical measures in high dimensions. We propose a new class of metrics and show that, under these metrics, the convergence is free of the curse of dimensionality (CoD). This feature is critical for high-dimensional analysis and stands in contrast to classical metrics such as the Wasserstein distance. The proposed metrics originate from the maximum mean discrepancy, which we generalize by giving specific criteria for selecting test function spaces that guarantee the absence of CoD. We therefore call this class of metrics the generalized maximum mean discrepancy (GMMD). Examples of selected test function spaces include the reproducing kernel Hilbert space, the Barron space, and flow-induced function spaces. Three applications of the proposed metrics are presented: 1. the convergence of the empirical measure of i.i.d. random variables; 2. the convergence of the n-particle system to the solution of the McKean-Vlasov stochastic differential equation; 3. the construction of an ε-Nash equilibrium for a homogeneous n-player game from its mean-field limit. As a byproduct, we prove that, given a distribution close to the target distribution in GMMD and a certain representation of the target distribution, we can generate a distribution close to the target in terms of the Wasserstein distance and relative entropy. Overall, we show that the proposed class of metrics is a powerful tool for analyzing the convergence of empirical measures in high dimensions without CoD.
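For context, the underlying object is the standard maximum mean discrepancy: for a class F of test functions, MMD_F(mu, nu) = sup over f in F with ||f||_F <= 1 of |E_mu f - E_nu f|, and the GMMD arises from particular choices of F such as an RKHS, the Barron space, or a flow-induced space. The sketch below is a minimal illustration of the RKHS instance only, not the paper's code: it estimates the squared MMD between two empirical measures of i.i.d. standard Gaussians in dimension d = 100. The Gaussian kernel, the bandwidth sigma = sqrt(d), and the sample sizes are illustrative assumptions.

    import numpy as np

    def sq_dists(x, y):
        # Pairwise squared Euclidean distances between the rows of x and y,
        # via ||a - b||^2 = ||a||^2 + ||b||^2 - 2 a.b, clipped at 0 to guard
        # against tiny negative values from round-off.
        xx = np.sum(x * x, axis=1)[:, None]
        yy = np.sum(y * y, axis=1)[None, :]
        return np.maximum(xx + yy - 2.0 * (x @ y.T), 0.0)

    def mmd2_rbf(x, y, sigma):
        # Biased (V-statistic) estimate of the squared MMD between the
        # empirical measures of samples x and y, with the unit ball of a
        # Gaussian RKHS as the test function space -- one admissible choice
        # in the GMMD framework.
        kxx = np.exp(-sq_dists(x, x) / (2.0 * sigma**2)).mean()
        kyy = np.exp(-sq_dists(y, y) / (2.0 * sigma**2)).mean()
        kxy = np.exp(-sq_dists(x, y) / (2.0 * sigma**2)).mean()
        return kxx + kyy - 2.0 * kxy

    rng = np.random.default_rng(0)
    d = 100  # high dimension; empirical Wasserstein rates degrade like n**(-1/d)
    sigma = np.sqrt(d)  # median-heuristic-style bandwidth for standard Gaussians
    for n in (100, 400, 1600):
        x = rng.standard_normal((n, d))
        y = rng.standard_normal((n, d))
        print(f"n={n:5d}  squared MMD estimate ~ {mmd2_rbf(x, y, sigma):.2e}")

The printed values decay roughly like 1/n regardless of d, whereas the Wasserstein distance between empirical measures typically converges at the dimension-dependent rate n^(-1/d); this dimension-free behavior of MMD-type metrics is the phenomenon the paper develops in generality.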


