Entropy and the Discrete Central Limit Theorem

06/01/2021
by Lampros Gavalakis et al.

A strengthened version of the central limit theorem for discrete random variables is established, relying only on information-theoretic tools and elementary arguments. It is shown that the relative entropy between the standardised sum of n independent and identically distributed lattice random variables and an appropriately discretised Gaussian vanishes as n→∞.
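The convergence claimed in the abstract can be illustrated numerically. The sketch below (an assumption for illustration, not the paper's construction) takes the simplest lattice case, Bernoulli(1/2) summands, and computes the relative entropy D(P_n || Q_n) between the standardised binomial sum and one natural discretisation of the standard Gaussian: its density evaluated at the lattice points and renormalised. The helper name `kl_to_discretized_gaussian` is invented for this example.

```python
import math

def kl_to_discretized_gaussian(n):
    """D(P_n || Q_n) for the standardised sum of n Bernoulli(1/2) variables.

    P_n: pmf of (S_n - n/2) / (sqrt(n)/2) on its lattice points x_k.
    Q_n: standard Gaussian density evaluated at the same lattice points and
         renormalised -- one way to 'discretise' the Gaussian, assumed here
         for illustration only.
    """
    sigma = math.sqrt(n) / 2.0
    # Binomial(n, 1/2) pmf via log-factorials for numerical stability.
    logfact = [0.0] * (n + 1)
    for i in range(1, n + 1):
        logfact[i] = logfact[i - 1] + math.log(i)
    p = [math.exp(logfact[n] - logfact[k] - logfact[n - k] - n * math.log(2))
         for k in range(n + 1)]
    # Lattice points of the standardised sum.
    x = [(k - n / 2.0) / sigma for k in range(n + 1)]
    q = [math.exp(-xi * xi / 2.0) for xi in x]
    z = sum(q)
    q = [qi / z for qi in q]
    # Relative entropy in nats.
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

if __name__ == "__main__":
    for n in (4, 16, 64, 256):
        print(n, kl_to_discretized_gaussian(n))
```

Running this shows the relative entropy shrinking monotonically as n grows, consistent with the vanishing-relative-entropy statement (for this particular summand and discretisation).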


Related research

A conditional Berry-Esseen inequality (01/28/2019)
As an extension of a central limit theorem established by Svante Janson, ...

A Central Limit Theorem for incomplete U-statistics over triangular arrays (03/23/2020)
We analyze the fluctuations of incomplete U-statistics over a triangular ...

Information Properties of a Random Variable Decomposition through Lattices (11/07/2022)
A full-rank lattice in the Euclidean space is a discrete set formed by a ...

On the Maximum Entropy of a Sum of Independent Discrete Random Variables (08/03/2020)
Let X_1, ..., X_n be independent random variables taking values in the alp...

An essay on copula modelling for discrete random vectors; or how to pour new wine into old bottles (01/25/2019)
Copulas have now become ubiquitous statistical tools for describing, ana...

Central limit theorem for a partially observed interacting system of Hawkes processes (06/19/2019)
We observe the actions of a K sub-sample of N individuals up to time t f...

Maximal correlation and the rate of Fisher information convergence in the Central Limit Theorem (05/28/2019)
We consider the behaviour of the Fisher information of scaled sums of in...
