The statistical Minkowski distances: Closed-form formula for Gaussian Mixture Models

01/09/2019
by Frank Nielsen, et al.

The traditional Minkowski distances are induced by the corresponding Minkowski norms on real-valued vector spaces. In this work, we propose novel statistical symmetric distances based on Minkowski's inequality for probability densities belonging to Lebesgue spaces. These statistical Minkowski distances admit closed-form formulas for Gaussian mixture models when parameterized by integer exponents: namely, we prove that these distances between mixtures are obtained from multinomial expansions and expressed as weighted sums of inverse exponentials of generalized Jensen diversity indices of the mixture component distributions. This result extends to arbitrary mixtures of exponential families whose natural parameter spaces are cones: this includes the binomial, the multinomial, the zero-centered Laplacian, the Gaussian, and the Wishart mixtures, among others. We also derive a Minkowski diversity index of a normalized weighted set of probability distributions from Minkowski's inequality.
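The abstract does not reproduce the exact definition of the proposed distances, only that they are built from Minkowski's inequality ||f+g||_p <= ||f||_p + ||g||_p on Lebesgue spaces. As a hedged numerical sketch, one natural candidate consistent with that description is the tightness gap of the inequality, evaluated here for univariate Gaussian mixtures by quadrature; the function names (`gmm_pdf`, `lp_norm`, `minkowski_gap`) and the gap-based definition are illustrative assumptions, not the paper's closed-form formula, which instead uses multinomial expansions over the mixture components.

```python
# Numerical sketch (not the paper's closed form): the Minkowski-inequality
# tightness gap between two univariate Gaussian mixture densities.
import numpy as np
from scipy.integrate import trapezoid
from scipy.stats import norm


def gmm_pdf(x, weights, means, stds):
    """Density of a univariate Gaussian mixture evaluated at points x."""
    return sum(w * norm.pdf(x, m, s) for w, m, s in zip(weights, means, stds))


def lp_norm(f, xs, p):
    """Numerical L_p norm of a density f sampled on the grid xs."""
    return trapezoid(f(xs) ** p, xs) ** (1.0 / p)


def minkowski_gap(f, g, xs, p=2):
    """Tightness gap ||f||_p + ||g||_p - ||f+g||_p (an illustrative distance).

    Non-negative by Minkowski's inequality (p >= 1); zero when f and g are
    proportional, hence zero for identical densities.
    """
    return lp_norm(f, xs, p) + lp_norm(g, xs, p) - lp_norm(lambda x: f(x) + g(x), xs, p)


xs = np.linspace(-10.0, 10.0, 4001)
f = lambda x: gmm_pdf(x, [0.5, 0.5], [-2.0, 2.0], [1.0, 1.0])
g = lambda x: gmm_pdf(x, [0.3, 0.7], [0.0, 3.0], [1.0, 0.5])

print(minkowski_gap(f, f, xs))  # ~0 for identical mixtures
print(minkowski_gap(f, g, xs))  # strictly positive for distinct mixtures
```

Because the grid-based quadrature scales poorly with dimension, the paper's contribution is precisely to avoid such numerical integration: for integer exponents, the L_p norms of mixtures expand multinomially into terms that are closed-form for exponential families with conic natural parameter spaces.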


