Differentiable Clustering with Perturbed Spanning Forests

05/25/2023
by Lawrence Stewart, et al.

We introduce a differentiable clustering method based on minimum-weight spanning forests, a variant of spanning trees with several connected components. Our method relies on stochastic perturbations of solutions of linear programs, for smoothing and efficient gradient computations. This allows us to include clustering in end-to-end trainable pipelines. We show that our method performs well even in difficult settings, such as datasets with high noise and challenging geometries. We also formulate an ad hoc loss to efficiently learn from partial clustering data using this operation. We demonstrate its performance on several real-world datasets for supervised and semi-supervised tasks.
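To illustrate the perturbation-based smoothing the abstract describes, below is a minimal sketch of the perturb-and-average idea. The helper names (spanning_forest_adjacency, perturbed_forest), the Gaussian perturbation of pairwise-distance edge weights, and the construction of a k-component forest by dropping the k-1 heaviest edges of a minimum spanning tree are illustrative assumptions, not the paper's linear-programming formulation; gradient estimation from the same perturbation samples is also omitted here.

```python
# Minimal sketch (assumptions noted above): average spanning-forest edge
# indicators over Gaussian perturbations of the edge weights to obtain a
# smoothed, "soft" clustering connectivity matrix.
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree
from scipy.spatial.distance import pdist, squareform


def spanning_forest_adjacency(weights, n_clusters):
    """Edge-indicator matrix of a minimum-weight spanning forest with n_clusters components."""
    n = weights.shape[0]
    mst = minimum_spanning_tree(weights).tocoo()
    order = np.argsort(mst.data)               # lightest edges first
    keep = order[: n - n_clusters]             # dropping the k-1 heaviest MST edges leaves k components
    adj = np.zeros((n, n))
    adj[mst.row[keep], mst.col[keep]] = 1.0
    return adj + adj.T                         # symmetric indicator of kept edges


def perturbed_forest(weights, n_clusters, sigma=0.1, n_samples=100, seed=0):
    """Monte Carlo average of forest indicators under Gaussian weight perturbations."""
    rng = np.random.default_rng(seed)
    avg = np.zeros_like(weights)
    for _ in range(n_samples):
        noise = np.triu(rng.normal(scale=sigma, size=weights.shape), 1)
        perturbed = np.maximum(weights + noise + noise.T, 1e-9)  # keep weights positive
        np.fill_diagonal(perturbed, 0.0)                         # no self-loops
        avg += spanning_forest_adjacency(perturbed, n_clusters)
    return avg / n_samples                     # soft connectivity, entries in [0, 1]


# Toy usage: two well-separated Gaussian blobs.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(size=(20, 2)), rng.normal(size=(20, 2)) + 6.0])
W = squareform(pdist(X))                       # pairwise-distance edge weights
soft_adj = perturbed_forest(W, n_clusters=2, sigma=0.05, n_samples=50)
```

In an end-to-end pipeline of the kind the abstract mentions, the edge weights would themselves come from a learned embedding, and the averaged indicator matrix could be compared against partial co-clustering labels in a loss; this sketch only shows the smoothed forward pass.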

