The information bottleneck and geometric clustering

12/27/2017
by D J Strouse, et al.

The information bottleneck (IB) approach to clustering takes a joint distribution P(X,Y) and maps the data X to cluster labels T that retain maximal information about Y (Tishby et al., 1999). This objective results in an algorithm that clusters data points based upon the similarity of their conditional distributions P(Y|X). This is in contrast to classic "geometric clustering" algorithms such as k-means and Gaussian mixture models (GMMs), which take a set of observed data points {x_i}_i=1:N and cluster them based upon their geometric (typically Euclidean) distance from one another. Here, we show how to use the deterministic information bottleneck (DIB) (Strouse and Schwab, 2017), a variant of IB, to perform geometric clustering, by choosing cluster labels that preserve information about data point location on a smoothed dataset. We also introduce a novel, intuitive method to choose the number of clusters, via kinks in the information curve. We apply this approach to a variety of simple clustering problems, showing that DIB with our model selection procedure recovers the generative cluster labels. We also show that, for one simple case, DIB interpolates between the cluster boundaries of GMMs and k-means in the large data limit. Thus, our IB approach to clustering also provides an information-theoretic perspective on these classic algorithms.
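The idea above can be illustrated with a minimal numpy sketch. This is not the authors' code: the function name, the Gaussian smoothing width `s`, the trade-off parameter `beta`, and the initialization are all illustrative assumptions. Each point i is given a smoothed conditional p(y|i) over data-point locations, and DIB-style deterministic updates assign i to the cluster t maximizing log q(t) − beta · KL(p(y|i) ‖ q(y|t)).

```python
import numpy as np

def dib_geometric(X, n_clusters, beta=10.0, s=1.0, n_iter=50, seed=0):
    """Sketch of DIB-style geometric clustering (illustrative, not the paper's code).

    Each point i gets p(y|i): a Gaussian-smoothed distribution over the
    locations of all data points. A point is deterministically assigned
    to the cluster t maximizing log q(t) - beta * KL(p(y|i) || q(y|t)).
    """
    N = X.shape[0]
    eps = 1e-12

    # Pairwise squared distances -> smoothed conditionals p(y|i)
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    p_y_given_i = np.exp(-d2 / (2 * s ** 2))
    p_y_given_i /= p_y_given_i.sum(axis=1, keepdims=True)

    rng = np.random.default_rng(seed)
    labels = rng.integers(n_clusters, size=N)

    for _ in range(n_iter):
        # Cluster marginals q(t) under a uniform prior p(i)
        q_t = np.array([(labels == t).mean() for t in range(n_clusters)])
        # q(y|t): average of the members' conditionals (uniform if empty)
        q_y_given_t = np.vstack([
            p_y_given_i[labels == t].mean(0) if (labels == t).any()
            else np.full(N, 1.0 / N)
            for t in range(n_clusters)
        ])
        # KL(p(y|i) || q(y|t)) for every (i, t) pair
        logratio = (np.log(p_y_given_i[:, None, :] + eps)
                    - np.log(q_y_given_t[None, :, :] + eps))
        kl = (p_y_given_i[:, None, :] * logratio).sum(-1)
        # Deterministic (hard) assignment -- the "D" in DIB
        new = np.argmax(np.log(q_t + eps)[None, :] - beta * kl, axis=1)
        if (new == labels).all():
            break
        labels = new
    return labels
```

Large `beta` weights the KL term, which pulls points with similar smoothed conditionals together, while the log q(t) term penalizes using many clusters; sweeping `beta` and recording the resulting compression/relevance values traces out the information curve whose kinks the paper uses for model selection.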

