A short note on an inequality between KL and TV

02/15/2022
by Clément L. Canonne, et al.

The goal of this short note is to discuss the relation between Kullback–Leibler divergence and total variation distance, starting with the celebrated Pinsker's inequality relating the two, before switching to a simple, yet (arguably) more useful inequality, apparently not as well known, due to Bretagnolle and Huber. We also discuss applications of this bound for minimax testing lower bounds.
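
For context, the two inequalities in question are standard and can be stated as follows (this formulation is not quoted from the note itself): for probability distributions P and Q on the same space, writing TV for the total variation distance and KL for the Kullback–Leibler divergence,

    TV(P, Q) <= sqrt( (1/2) * KL(P || Q) )          (Pinsker)
    TV(P, Q) <= sqrt( 1 - exp( -KL(P || Q) ) )      (Bretagnolle–Huber)

The right-hand side of Pinsker's inequality exceeds 1 as soon as KL(P || Q) > 2, making it vacuous in that regime, whereas the Bretagnolle–Huber bound always stays below 1; this is precisely what makes the latter useful for minimax testing lower bounds, where the divergence between the hypotheses is often large.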

Related research:

- Tight bounds for augmented KL divergence in terms of augmented total variation distance (03/01/2022): We provide optimal variational upper and lower bounds for the augmented ...
- About the lower bounds for the multiple testing problem (07/14/2018): Given an observed random variable, consider the problem of recovering it...
- Better and Simpler Error Analysis of the Sinkhorn-Knopp Algorithm for Matrix Scaling (01/09/2018): Given a non-negative n × m real matrix A, the matrix scaling problem is...
- A Bound for Stieltjes Constants (01/26/2023): The goal of this note is to improve on the currently available bounds fo...
- Squirrel: A Switching Hyperparameter Optimizer (12/15/2020): In this short note, we describe our submission to the NeurIPS 2020 BBO c...
- Frame-constrained Total Variation Regularization for White Noise Regression (07/05/2018): Despite the popularity and practical success of total variation (TV) reg...
- Exponential inequality for chaos based on sampling without replacement (08/28/2018): We are interested in the behavior of particular functionals, in a framew...
