Estimating Certain Integral Probability Metric (IPM) is as Hard as Estimating under the IPM

by Tengyuan Liang, et al.

We study the minimax optimal rates for estimating a range of Integral Probability Metrics (IPMs) between two unknown probability measures, based on n independent samples from them. Curiously, we show that estimating the IPM itself between probability measures is not significantly easier than estimating the probability measures under the IPM. We prove that the minimax optimal rates for these two problems are multiplicatively equivalent, up to a log log(n) / log(n) factor.
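To make the estimation problem concrete, here is a minimal sketch of a plug-in estimator for one well-known IPM, the Wasserstein-1 distance, in one dimension. This is only an illustration of the general setup (two samples in, a metric estimate out), not the paper's estimator; the function name and simulated distributions are assumptions for the example. For equal-size one-dimensional samples, W1 between the empirical measures reduces to the average gap between matched order statistics.

```python
import numpy as np

def plug_in_w1(x, y):
    """Plug-in estimate of the Wasserstein-1 distance between the
    empirical measures of two equal-size 1-D samples: this equals the
    mean absolute gap between matched order statistics."""
    x = np.sort(np.asarray(x, dtype=float))
    y = np.sort(np.asarray(y, dtype=float))
    if len(x) != len(y):
        raise ValueError("samples must have equal size in this simple sketch")
    return float(np.mean(np.abs(x - y)))

# Simulated example (assumed distributions): P = N(0, 1), Q = N(0.5, 1),
# so the true W1 distance is the mean shift, 0.5.
rng = np.random.default_rng(0)
n = 2000
sample_p = rng.normal(0.0, 1.0, n)
sample_q = rng.normal(0.5, 1.0, n)
print(plug_in_w1(sample_p, sample_q))  # close to 0.5 for large n
```

The paper's point, in these terms, is that the sample size n needed for this scalar estimate to converge is essentially the same (up to a log log(n) / log(n) factor) as the sample size needed to estimate the underlying measures themselves under the IPM.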


On the Minimax Optimality of Estimating the Wasserstein Metric

We study the minimax optimal rate for estimating the Wasserstein-1 metri...

Minimax rates in outlier-robust estimation of discrete models

We consider the problem of estimating the probability distribution of a ...

Bounding the expectation of the supremum of empirical processes indexed by Hölder classes

We obtain upper bounds on the expectation of the supremum of empirical p...

Towards Optimal Estimation of Bivariate Isotonic Matrices with Unknown Permutations

Many applications, including rank aggregation, crowd-labeling, and graph...

Minimax bounds for estimating multivariate Gaussian location mixtures

We prove minimax bounds for estimating Gaussian location mixtures on ℝ^d...

Introduction to Neutrosophic Measure, Neutrosophic Integral, and Neutrosophic Probability

In this paper, we introduce for the first time the notions of neutrosoph...

Learning with Stochastic Orders

Learning high-dimensional distributions is often done with explicit like...