On the symmetrical Kullback-Leibler Jeffreys centroids

03/29/2013
by Frank Nielsen

Due to the success of the bag-of-words modeling paradigm, clustering histograms has become an important ingredient of modern information processing. Clustering histograms can be performed using the celebrated k-means centroid-based algorithm. In applications, it is usually required to deal with symmetric distances. In this letter, we consider the Jeffreys divergence that symmetrizes the Kullback-Leibler divergence, and investigate the computation of Jeffreys centroids. We first prove that the Jeffreys centroid can be expressed analytically using the Lambert W function for positive histograms. We then show how to obtain a fast guaranteed approximation when dealing with frequency histograms. Finally, we conclude with some remarks on k-means histogram clustering.
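As a concrete illustration of the closed form mentioned in the abstract, here is a minimal NumPy/SciPy sketch. The letter's analytic solution for the positive-histogram Jeffreys centroid works out coordinate-wise to a_j / W(e * a_j / g_j), where a_j and g_j are the arithmetic and geometric means of bin j across the input histograms and W is the principal branch of the Lambert W function; normalizing that centroid yields the fast guaranteed approximation for frequency histograms. Function names below are illustrative, not from the paper.

```python
import numpy as np
from scipy.special import lambertw


def jeffreys_positive_centroid(H):
    """Closed-form Jeffreys centroid of positive histograms.

    H : (n, d) array with strictly positive entries; rows are
        histograms that need not be normalized.

    Coordinate-wise: c_j = a_j / W(e * a_j / g_j), with a_j the
    arithmetic mean, g_j the geometric mean, and W the principal
    branch of the Lambert W function.  By the AM-GM inequality
    a_j >= g_j, so the argument of W is >= e and the result is real.
    """
    H = np.asarray(H, dtype=float)
    a = H.mean(axis=0)                    # arithmetic mean per bin
    g = np.exp(np.log(H).mean(axis=0))    # geometric mean per bin
    return a / lambertw(np.e * a / g).real


def jeffreys_frequency_centroid_approx(H):
    """Normalizing the positive centroid gives the paper's fast
    guaranteed approximation for frequency (normalized) histograms."""
    c = jeffreys_positive_centroid(H)
    return c / c.sum()


# Toy usage: two frequency histograms over three bins.
H = np.array([[0.2, 0.3, 0.5],
              [0.4, 0.4, 0.2]])
print(jeffreys_frequency_centroid_approx(H))
```

Note that the approximation only requires d evaluations of Lambert W, so a k-means step with this centroid remains linear in the data size, in contrast to iterative numerical optimization of the frequency centroid.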
