Bounds on the Information Divergence for Hypergeometric Distributions

02/07/2020
by Peter Harremoës, et al.

The hypergeometric distributions have many important applications, but they have not received sufficient attention in information theory. Hypergeometric distributions can be approximated by binomial distributions or Poisson distributions. In this paper we present upper and lower bounds on the information divergence between hypergeometric distributions and these approximating distributions. These bounds are important for statistical testing and for a better understanding of the notion of exchangeability.
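
As an illustration of the quantity being bounded, the sketch below evaluates the information divergence D(Hyp(N, K, n) || Bin(n, K/N)) numerically with SciPy. The function name and the parameterization (N population size, K marked items, n draws) are illustrative choices, not notation from the paper.

```python
import numpy as np
from scipy.stats import binom, hypergeom

def kl_hypergeom_binomial(N, K, n):
    """D(Hyp(N, K, n) || Bin(n, K/N)) in nats: the divergence between
    sampling n items without replacement from a population of size N
    containing K marked items, and its binomial (with-replacement)
    approximation."""
    # Support of the hypergeometric distribution.
    k = np.arange(max(0, n - (N - K)), min(n, K) + 1)
    # SciPy's argument order is hypergeom.pmf(k, M, n, N) with
    # M = population size, n = number of marked items, N = number of draws.
    p = hypergeom.pmf(k, N, K, n)
    q = binom.pmf(k, n, K / N)
    return float(np.sum(p * np.log(p / q)))

# With the draw count n fixed, the divergence decreases as the population
# grows, which is why the binomial approximation works for small sampling
# fractions; the paper's bounds quantify this behavior.
for N in (50, 500, 5000):
    print(N, kl_hypergeom_binomial(N, K=N // 5, n=10))
```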


