Guaranteed Deterministic Bounds on the Total Variation Distance between Univariate Mixtures

06/29/2018 · by Frank Nielsen, et al.

The total variation distance is a core statistical distance between probability measures that satisfies the metric axioms and always takes values in [0,1]. This distance plays a fundamental role in machine learning and signal processing: it is a member of the broader class of f-divergences, and it is related to the probability of error in Bayesian hypothesis testing. Since the total variation distance does not admit closed-form expressions for statistical mixtures (such as Gaussian mixture models), in practice one often has to rely on costly numerical integration or on fast Monte Carlo approximations, which, however, do not provide deterministic lower and upper bounds. In this work, we consider two methods for bounding the total variation distance between univariate mixture models: the first method is based on the information monotonicity property of the total variation distance to design guaranteed nested deterministic lower bounds; the second method relies on computing the geometric lower and upper envelopes of the weighted mixture components to derive deterministic bounds based on density ratios. We demonstrate the tightness of our bounds in a series of experiments on Gaussian, Gamma and Rayleigh mixture models.
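
As an illustration of the first bounding strategy, the minimal Python sketch below (not taken from the paper; the helper names mixture_pdf, mixture_cdf, tv_lower_bound and tv_numerical are hypothetical, and NumPy/SciPy are assumed) shows how information monotonicity yields guaranteed lower bounds for two univariate Gaussian mixtures: coarse-graining both densities over any finite partition of the real line can only decrease the total variation distance, so half the sum of absolute differences of the bin masses is a certified lower bound, and nested refinements of the partition can only tighten it.

import numpy as np
from scipy.stats import norm
from scipy.integrate import quad

def mixture_pdf(x, weights, means, stds):
    # Density of a univariate Gaussian mixture evaluated at a scalar x.
    return sum(w * norm.pdf(x, loc=m, scale=s) for w, m, s in zip(weights, means, stds))

def mixture_cdf(x, weights, means, stds):
    # Cumulative distribution function of the mixture at a scalar x.
    return sum(w * norm.cdf(x, loc=m, scale=s) for w, m, s in zip(weights, means, stds))

def tv_lower_bound(p_params, q_params, inner_edges):
    # Coarse-graining lower bound: for any finite partition of the real line,
    # 0.5 * sum_i |P(bin_i) - Q(bin_i)| <= TV(p, q) by information monotonicity.
    edges = np.concatenate(([-np.inf], np.asarray(inner_edges, dtype=float), [np.inf]))
    P = np.diff([mixture_cdf(e, *p_params) for e in edges])
    Q = np.diff([mixture_cdf(e, *q_params) for e in edges])
    return 0.5 * np.abs(P - Q).sum()

def tv_numerical(p_params, q_params, lo=-30.0, hi=30.0):
    # Reference value via numerical integration of 0.5 * |p(x) - q(x)| over [lo, hi]
    # (an approximation, not a certified bound).
    integrand = lambda x: 0.5 * abs(mixture_pdf(x, *p_params) - mixture_pdf(x, *q_params))
    return quad(integrand, lo, hi, limit=200)[0]

# Two example univariate Gaussian mixtures given as (weights, means, standard deviations).
p = ([0.5, 0.5], [-1.0, 2.0], [1.0, 0.5])
q = ([0.3, 0.7], [0.0, 2.5], [1.0, 0.8])

reference = tv_numerical(p, q)
for k in (4, 16, 64, 256):  # nested partitions: each refinement can only tighten the bound
    bound = tv_lower_bound(p, q, np.linspace(-10.0, 10.0, k + 1))
    print(f"{k:4d} bins: lower bound = {bound:.6f}   (numerical reference ~ {reference:.6f})")

With the nested partitions used above, the printed lower bounds form a non-decreasing sequence approaching the numerically integrated reference value; the paper's second method, based on geometric envelopes of the weighted components, additionally yields upper bounds and is not covered by this sketch.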

Related research

09/02/2021 · Lower Bounds on the Total Variation Distance Between Mixtures of Two Gaussians
Mixtures of high dimensional Gaussian distributions have been studied ex...

06/19/2016 · Guaranteed bounds on the Kullback-Leibler divergence of univariate mixtures using piecewise log-sum-exp inequalities
Information-theoretic measures such as the entropy, cross-entropy and th...

12/09/2016 · A series of maximum entropy upper bounds of the differential entropy
We present a series of closed-form maximum entropy upper bounds for the ...

07/13/2021 · Fast approximations of the Jeffreys divergence between univariate Gaussian mixture models via exponential polynomial densities
The Jeffreys divergence is a renown symmetrization of the statistical Ku...

02/01/2023 · The Parametric Stability of Well-separated Spherical Gaussian Mixtures
We quantify the parameter stability of a spherical Gaussian Mixture Mode...

05/12/2020 · High Probability Lower Bounds for the Total Variation Distance
The statistics and machine learning communities have recently seen a gro...

05/08/2019 · Bounding distributional errors via density ratios
We present some new and explicit error bounds for the approximation of d...
