Estimation of Entropy in Constant Space with Improved Sample Complexity

05/19/2022
by Maryam Aliakbarpour, et al.

Recent work of Acharya et al. (NeurIPS 2019) showed how to estimate the entropy of a distribution 𝒟 over an alphabet of size k up to ±ϵ additive error by streaming over (k/ϵ^3) · polylog(1/ϵ) i.i.d. samples and using only O(1) words of memory. In this work, we give a new constant-memory scheme that reduces the sample complexity to (k/ϵ^2) · polylog(1/ϵ). We conjecture that this is optimal up to polylog(1/ϵ) factors.
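To make the memory constraint concrete, the sketch below is an illustrative baseline only, not the paper's algorithm: the standard plug-in estimator computes the empirical entropy from i.i.d. samples, but it keeps one counter per observed symbol and therefore needs up to Θ(k) words of memory for an alphabet of size k. The function name `plugin_entropy` and the toy uniform-distribution check are our own illustration.

```python
import math
import random
from collections import Counter


def plugin_entropy(samples):
    """Empirical (plug-in) entropy estimate, in nats, from i.i.d. samples.

    Keeps a counter for every distinct symbol seen, so memory grows up to
    Theta(k) words for an alphabet of size k -- the cost that constant-space
    streaming schemes avoid.
    """
    counts = Counter(samples)
    n = len(samples)
    return -sum((c / n) * math.log(c / n) for c in counts.values())


# Toy check: uniform distribution over k = 8 symbols, true entropy = log 8.
random.seed(0)
k = 8
samples = [random.randrange(k) for _ in range(100_000)]
print(plugin_entropy(samples), "vs true value", math.log(k))
```

The streaming schemes discussed in the abstract reach the same additive-ϵ guarantee while replacing the Θ(k) counters above with only O(1) words of state, at the cost of reading (k/ϵ^2) · polylog(1/ϵ) samples.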


Related research:

- Estimating Entropy of Distributions in Constant Space (11/18/2019)
- Metric Entropy Duality and the Sample Complexity of Outcome Indistinguishability (03/09/2022)
- Bias Reduction for Sum Estimation (08/02/2022)
- Optimistic optimization of a Brownian (01/15/2019)
- Practical Estimation of Renyi Entropy (02/21/2020)
- Discrete-valued Preference Estimation with Graph Side Information (03/16/2020)
- Finite-sample concentration of the empirical relative entropy around its mean (03/02/2022)
