The Optimality of Profile Maximum Likelihood in Estimating Sorted Discrete Distributions
A striking result of [Acharya et al. 2017] showed that to estimate symmetric properties of discrete distributions, plugging in the distribution that maximizes the likelihood of the observed multiset of frequencies, also known as the profile maximum likelihood (PML) distribution, is competitive with any estimator regardless of the symmetric property. Specifically, given n observations from the discrete distribution, if some estimator incurs an error ε with probability at most δ, then plugging in the PML distribution incurs an error 2ε with probability at most δ·exp(3√n). In this paper, we strengthen the above result and show that, using a careful chaining argument, the error probability can be reduced to δ^(1-c)·exp(c'n^(1/3+c)) for arbitrarily small constants c>0 and some constant c'>0. In particular, we show that the PML distribution is an optimal estimator of the sorted true distribution: it is ε-close in sorted ℓ_1 distance to the true distribution with support size k for any n=Ω(k/(ε^2 log k)) and ε≫n^(-1/3), which are the information-theoretically optimal sample complexity and the largest error regime where the classical empirical distribution is sub-optimal, respectively. To strengthen the analysis of the PML, a key ingredient is to employ novel "continuity" properties of the PML distributions and to construct a chain of suitable quantized PMLs, or "coverings". We also construct a novel approximation-based estimator for the sorted distribution with a near-optimal concentration property without any sample splitting, where, as a byproduct, we obtain tight trade-offs between the polynomial approximation error and the maximum magnitude of coefficients in the Poisson approximation of 1-Lipschitz functions.
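To make the central objects concrete, the following minimal Python sketch (not from the paper; the function names, candidate grid, and brute-force enumeration are purely illustrative) computes the profile of a sample, evaluates the profile likelihood by exhaustive enumeration on a tiny instance, selects a PML distribution from a small candidate set, and measures the sorted ℓ_1 distance used as the error metric above. Computing the exact PML in general is a hard optimization problem; this sketch only illustrates the definitions on a toy example.

```python
import itertools
from collections import Counter

def profile(sample):
    """Multiset of frequencies of the observed symbols (labels discarded)."""
    return tuple(sorted(Counter(sample).values(), reverse=True))

def profile_likelihood(p, phi, n):
    """Probability that n i.i.d. draws from p produce profile phi.
    Brute-force enumeration of all length-n sequences; feasible only
    for tiny n and support sizes (illustrative, not the paper's method)."""
    k = len(p)
    total = 0.0
    for seq in itertools.product(range(k), repeat=n):
        if profile(seq) == phi:
            prob = 1.0
            for x in seq:
                prob *= p[x]
            total += prob
    return total

def sorted_l1(p, q):
    """Sorted (permutation-invariant) ℓ_1 distance between two distributions."""
    ps, qs = sorted(p, reverse=True), sorted(q, reverse=True)
    m = max(len(ps), len(qs))
    ps += [0.0] * (m - len(ps))
    qs += [0.0] * (m - len(qs))
    return sum(abs(a - b) for a, b in zip(ps, qs))

# Toy example: observe a sample, compute its profile, and pick the "PML"
# distribution from a small hypothetical candidate grid by maximizing the
# profile likelihood; then compare to a reference distribution in sorted ℓ_1.
sample = [0, 0, 1, 2]          # n = 4 observations over a support of size 3
phi = profile(sample)          # profile (2, 1, 1)
candidates = [
    [0.5, 0.3, 0.2],
    [0.4, 0.4, 0.2],
    [1/3, 1/3, 1/3],
]
pml = max(candidates, key=lambda p: profile_likelihood(p, phi, len(sample)))
print(phi, pml, sorted_l1(pml, [0.5, 0.25, 0.25]))
```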