Profile Entropy: A Fundamental Measure for the Learnability and Compressibility of Discrete Distributions

by Yi Hao, et al.

The profile of a sample is the multiset of its symbol frequencies. We show that for samples of discrete distributions, profile entropy is a fundamental measure unifying the concepts of estimation, inference, and compression. Specifically, profile entropy a) determines the speed of estimating the distribution relative to the best natural estimator; b) characterizes the rate of inferring all symmetric properties compared with the best estimator over any label-invariant distribution collection; c) serves as the limit of profile compression, for which we derive optimal near-linear-time block and sequential algorithms. To further our understanding of profile entropy, we investigate its attributes, provide algorithms for approximating its value, and determine its magnitude for numerous structural distribution families.
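To make the opening definition concrete: the profile of a sample is the multiset of its symbol frequencies, and it is invariant to relabeling the symbols. A minimal sketch (representing the multiset as a sorted tuple, a convention chosen here for illustration, not a representation prescribed by the paper):

```python
from collections import Counter

def profile(sample):
    """Return the profile of a sample: the multiset of symbol
    frequencies with the symbol labels discarded. Represented here
    as a tuple of multiplicities sorted in decreasing order."""
    freqs = Counter(sample)  # symbol -> frequency
    return tuple(sorted(freqs.values(), reverse=True))

# Samples 'a','b','a','c','b','a' have frequencies {a: 3, b: 2, c: 1},
# so the profile is (3, 2, 1). A relabeled sample gives the same profile.
print(profile("abacba"))  # -> (3, 2, 1)
print(profile("xyxzyx"))  # -> (3, 2, 1)
```

Because the profile forgets labels, it is exactly the statistic relevant for the symmetric (label-invariant) properties the abstract refers to; profile entropy is then the entropy of this profile viewed as a random object induced by the sampled distribution.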



Efficient Profile Maximum Likelihood for Universal Symmetric Property Estimation

Estimating symmetric properties of a distribution, e.g. support size, co...

The Optimality of Profile Maximum Likelihood in Estimating Sorted Discrete Distributions

A striking result of [Acharya et al. 2017] showed that to estimate symme...

Instance Based Approximations to Profile Maximum Likelihood

In this paper we provide a new efficient algorithm for approximately com...

Data Amplification: A Unified and Competitive Approach to Property Estimation

Estimating properties of discrete distributions is a fundamental problem...

On Modeling Profiles instead of Values

We consider the problem of estimating the distribution underlying an obs...

Extrapolating the profile of a finite population

We study a prototypical problem in empirical Bayes. Namely, consider a p...

Evolution of k-mer Frequencies and Entropy in Duplication and Substitution Mutation Systems

Genomic evolution can be viewed as string-editing processes driven by mu...