A maximum value for the Kullback-Leibler divergence between quantum discrete distributions

08/13/2020
by   Vincenzo Bonnici, et al.

This work presents an upper bound for the maximum value that the Kullback-Leibler (KL) divergence from a given discrete probability distribution P can reach. In particular, the aim is to find a discrete distribution Q that maximizes the KL divergence from a given P, under the assumption that both P and Q were generated by distributing a fixed discretized quantity. The construction also avoids infinite divergences. The theoretical findings are then used to propose a notion of normalized KL divergence, which is empirically shown to behave differently from previously known measures.
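The setting in the abstract can be illustrated with a small brute-force sketch. Assuming (this is an illustrative reading, not the paper's own algorithm or notation) that a "quantized" distribution arises from assigning N indivisible units to k bins, with every bin receiving at least one unit so that no probability is zero and the divergence stays finite, one can enumerate all such Q and find the one maximizing D(P||Q):

```python
import itertools
import math

def kl_divergence(p, q):
    """D(P||Q) = sum_i p_i * log2(p_i / q_i); terms with p_i == 0 contribute 0."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def quantized_distributions(n_quanta, n_bins):
    """Yield every distribution of n_quanta indivisible units over n_bins,
    giving each bin at least one unit (so all probabilities are positive)."""
    for cut in itertools.combinations(range(1, n_quanta), n_bins - 1):
        counts = [b - a for a, b in zip((0,) + cut, cut + (n_quanta,))]
        yield tuple(c / n_quanta for c in counts)

def max_kl_from(p, n_quanta):
    """Brute-force search for the quantized Q maximizing D(P||Q)."""
    return max(quantized_distributions(n_quanta, len(p)),
               key=lambda q: kl_divergence(p, q))

# Hypothetical example: P puts most mass on the first bin; the maximizing Q
# starves the heavy bins down toward the minimum quantum 1/N.
N = 10
p = (7 / N, 2 / N, 1 / N)
q_star = max_kl_from(p, N)
```

For this P, the search returns Q = (0.1, 0.1, 0.8): the adversarial Q assigns the minimum quantum to the bins where P is heaviest. The resulting maximum D(P||Q) is finite precisely because the quantization forbids zero-probability bins, matching the abstract's remark that infinite divergences are avoided.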


Related research

02/10/2021 — On the Properties of Kullback-Leibler Divergence Between Gaussians
  Kullback-Leibler (KL) divergence is one of the most important divergence...

02/28/2022 — KL Divergence Estimation with Multi-group Attribution
  Estimating the Kullback-Leibler (KL) divergence between two distribution...

05/25/2017 — Convergence of Langevin MCMC in KL-divergence
  Langevin diffusion is a commonly used tool for sampling from a given dis...

11/19/2019 — On the Upper Bound of the Kullback-Leibler Divergence and Cross Entropy
  This archiving article consists of several short reports on the discussi...

05/29/2023 — The First and Second Order Asymptotics of Covert Communication over AWGN Channels
  This paper investigates the asymptotics of the maximal throughput of com...

03/24/2022 — Kullback-Leibler control for discrete-time nonlinear systems on continuous spaces
  Kullback-Leibler (KL) control enables efficient numerical methods for no...

06/29/2013 — Concentration and Confidence for Discrete Bayesian Sequence Predictors
  Bayesian sequence prediction is a simple technique for predicting future...
