Limit Distribution for Smooth Total Variation and χ^2-Divergence in High Dimensions

02/03/2020
by   Ziv Goldfeld, et al.

Statistical divergences are ubiquitous in machine learning as tools for measuring distances between probability distributions. Since data science inherently relies on approximating distributions from samples, we consider empirical approximation under two central f-divergences: the total variation (TV) distance and the χ^2-divergence. To circumvent the sensitivity of these divergences to support mismatch, we adopt the framework of Gaussian smoothing. We study the limit distributions of √n δ_TV(P_n∗N, P∗N) and n χ^2(P_n∗N‖P∗N), where P_n is the empirical measure based on n independently and identically distributed (i.i.d.) samples from P, N := N(0, σ^2 I_d), and ∗ denotes convolution. In arbitrary dimension, the limit distributions are characterized in terms of a Gaussian process on R^d whose covariance operator depends on P and on the isotropic Gaussian density with parameter σ. This, in turn, implies optimality of the n^-1/2 expected-value convergence rates recently derived for δ_TV(P_n∗N, P∗N) and χ^2(P_n∗N‖P∗N). These strong statistical guarantees promote empirical approximation under Gaussian smoothing as a powerful framework for learning and inference based on high-dimensional data.
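As a minimal numerical sketch (not from the paper) of the quantity being studied, the following one-dimensional example estimates the smoothed TV distance δ_TV(P_n∗N, P∗N) by grid integration, assuming P = N(0, 1) so that P∗N = N(0, 1 + σ^2) has a closed-form density; the grid width, sample size, and σ = 1 are illustrative choices only.

```python
import numpy as np

def gauss_pdf(x, mu, sd):
    """Density of N(mu, sd^2) evaluated at x (vectorized)."""
    return np.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * np.sqrt(2.0 * np.pi))

rng = np.random.default_rng(0)
sigma = 1.0          # smoothing parameter of N = N(0, sigma^2)
n = 2000
X = rng.standard_normal(n)  # i.i.d. samples from P = N(0, 1)

# Truncated grid over R for numerical integration.
xs = np.linspace(-8.0, 8.0, 4001)
dx = xs[1] - xs[0]

# Density of P_n * N: an equal-weight mixture of Gaussians at the samples.
p_n = gauss_pdf(xs[:, None], X[None, :], sigma).mean(axis=1)
# Density of P * N: closed form N(0, 1 + sigma^2) since P is Gaussian.
p = gauss_pdf(xs, 0.0, np.sqrt(1.0 + sigma**2))

# delta_TV = (1/2) * integral of |p_n - p|.
tv = 0.5 * np.sum(np.abs(p_n - p)) * dx
print(f"delta_TV(P_n*N, P*N) ~= {tv:.4f}")
print(f"sqrt(n) * delta_TV   ~= {np.sqrt(n) * tv:.3f}")
```

Rerunning with different seeds shows √n δ_TV fluctuating around a stable scale rather than shrinking, consistent with the n^-1/2 rate the abstract refers to.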


Related research

[02/03/2020] Limit Distribution Theory for Smooth Wasserstein Distance with Applications to Generative Modeling
The 1-Wasserstein distance (W_1) is a popular proximity measure between ...

[05/30/2019] Sinkhorn Barycenters with Free Support via Frank-Wolfe Algorithm
We present a novel algorithm to estimate the barycenter of arbitrary pro...

[01/11/2021] From Smooth Wasserstein Distance to Dual Sobolev Norm: Empirical Approximation and Statistical Applications
Statistical distances, i.e., discrepancy measures between probability di...

[07/28/2021] Limit Distribution Theory for the Smooth 1-Wasserstein Distance with Applications
The smooth 1-Wasserstein distance (SWD) W_1^σ was recently proposed as a...

[05/30/2019] Convergence of Smoothed Empirical Measures with Applications to Entropy Estimation
This paper studies convergence of empirical measures smoothed by a Gauss...

[11/21/2022] Limit Distribution Theory for f-Divergences
f-divergences, which quantify discrepancy between probability distributi...

[03/11/2021] Non-Asymptotic Performance Guarantees for Neural Estimation of f-Divergences
Statistical distances (SDs), which quantify the dissimilarity between pr...
