Concentration Bounds for Discrete Distribution Estimation in KL Divergence

02/14/2023
by   Clément L. Canonne, et al.

We study the problem of estimating a discrete distribution over k elements in KL divergence, and provide concentration bounds for the Laplace (add-one) estimator computed from n samples. We show that the deviation of the KL loss from its mean scales as √(k)/n when n ≥ k, improving upon the best prior result of k/n. We also establish a matching lower bound showing that our bounds are tight up to polylogarithmic factors.
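To make the quantities above concrete, here is a minimal Python sketch (not code from the paper) of the Laplace (add-one) estimator and the KL loss it incurs, together with a small Monte Carlo check of how far that loss deviates from its mean. The uniform source distribution and the choices k = 100, n = 1000, and 2,000 trials are illustrative assumptions.

```python
import numpy as np

def laplace_estimator(counts, k):
    """Add-one (Laplace) smoothing: p_hat_i = (count_i + 1) / (n + k)."""
    n = counts.sum()
    return (counts + 1) / (n + k)

def kl_divergence(p, q):
    """KL(p || q) = sum_i p_i * log(p_i / q_i), dropping the 0 * log 0 terms."""
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

# Illustrative parameters (assumptions, not values from the paper); n >= k as in the stated regime.
k, n, trials = 100, 1000, 2000
p = np.full(k, 1.0 / k)          # uniform source distribution, chosen for simplicity
rng = np.random.default_rng(0)

losses = np.empty(trials)
for t in range(trials):
    counts = rng.multinomial(n, p)
    losses[t] = kl_divergence(p, laplace_estimator(counts, k))

# Compare the observed spread of the KL loss with the sqrt(k)/n scale from the abstract.
print(f"mean KL loss           ~ {losses.mean():.4f}")
print(f"max |loss - mean loss| ~ {np.abs(losses - losses.mean()).max():.4f}")
print(f"sqrt(k)/n              ~ {np.sqrt(k) / n:.4f}")
```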
