Universal Densities Exist for Every Finite Reference Measure
As is known, universal codes, which estimate the entropy rate consistently, exist for any stationary ergodic source over a finite alphabet but not over a countably infinite one. We recast the problem of universal coding as the problem of universal densities with respect to a given reference measure on a countably generated measurable space, examples being the counting measure and the Lebesgue measure. We show that universal densities, which estimate the differential entropy rate consistently, exist whenever the reference measure is finite, which disproves the claim that the assumption of a finite alphabet is necessary in general. To exhibit a universal density, we combine the prediction by partial matching (PPM) code with the non-parametric differential (NPD) entropy rate estimator, putting a prior over both all Markov orders and all quantization levels. The proof of universality applies Barron's asymptotic equipartition property for densities and the continuity of f-divergences along filtrations. As an application, we demonstrate that any universal density induces a strongly consistent Cesàro mean estimator of the conditional density given an infinite past, which, incidentally, also solves the problem of universal prediction under the 0-1 loss for a countable alphabet. Finally, we show that there exists a strongly consistent entropy rate estimator with respect to the Lebesgue measure in the class of stationary ergodic Gaussian processes.
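To fix intuitions, one schematic form of such a mixture density could read as follows; the prior weights $w(k,r)$ over Markov orders $k$ and quantization levels $r$, the finite-range quantizer $Q_r$, and the symbol $\mu$ for the finite reference measure are illustrative notation rather than the paper's own.

% Sketch only: a Bayesian mixture over Markov orders k and quantization
% levels r; PPM_k is the order-k PPM probability of the quantized string,
% and dividing by the \mu-measures of the quantization cells turns that
% probability into a density with respect to \mu^n.
\[
  \rho(x_1,\dots,x_n)
  \;=\;
  \sum_{k \ge 0} \sum_{r \ge 1} w(k,r)\,
  \frac{\mathrm{PPM}_k\bigl(Q_r(x_1),\dots,Q_r(x_n)\bigr)}
       {\prod_{i=1}^{n} \mu\bigl(Q_r^{-1}(Q_r(x_i))\bigr)},
  \qquad
  \sum_{k \ge 0} \sum_{r \ge 1} w(k,r) = 1 .
\]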
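One natural reading of the induced Cesàro mean estimator is sketched below; here $f(\,\cdot \mid X_{-\infty}^{0})$ denotes the true conditional density given the infinite past, and the indexing of the finite pasts is our assumption for illustration, not a quotation from the abstract.

% Sketch only: average the conditional densities induced by the universal
% density over growing finite pasts; strong consistency means almost sure
% convergence (in a suitable sense, e.g. in L^1(\mu)) to the conditional
% density given the infinite past.
\[
  \hat f_n\bigl(\,\cdot \mid X_{-\infty}^{0}\bigr)
  \;:=\;
  \frac{1}{n} \sum_{i=1}^{n} \rho\bigl(\,\cdot \mid X_{-i+1}^{0}\bigr)
  \;\xrightarrow[\;n\to\infty\;]{}\;
  f\bigl(\,\cdot \mid X_{-\infty}^{0}\bigr)
  \quad \text{a.s.}
\]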