Deep clustering with concrete k-means

10/17/2019
by Boyan Gao et al.

We address the problem of simultaneously learning a k-means clustering and deep feature representation from unlabelled data, which is of interest due to the potential of deep k-means to outperform traditional two-step feature extraction and shallow-clustering strategies. We achieve this by developing a gradient-estimator for the non-differentiable k-means objective via the Gumbel-Softmax reparameterisation trick. In contrast to previous attempts at deep clustering, our concrete k-means model can be optimised with respect to the canonical k-means objective and is easily trained end-to-end without resorting to alternating optimisation. We demonstrate the efficacy of our method on standard clustering benchmarks.
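The core trick described above is replacing the hard, non-differentiable argmin assignment in k-means with a relaxed one-hot sample drawn via the Gumbel-Softmax over negative squared distances, so gradients can flow to both the centroids and the feature extractor. The following is a minimal NumPy sketch of that relaxation, not the authors' implementation; the function names and the temperature `tau=0.5` are illustrative assumptions.

```python
import numpy as np

def gumbel_softmax(logits, tau=0.5, rng=None):
    """Draw a relaxed (soft) one-hot sample per row via the Gumbel-Softmax trick."""
    rng = rng or np.random.default_rng(0)
    # Gumbel(0, 1) noise: -log(-log(U)) with U ~ Uniform(0, 1)
    gumbel = -np.log(-np.log(rng.uniform(1e-10, 1.0, logits.shape)))
    y = (logits + gumbel) / tau
    y -= y.max(axis=1, keepdims=True)        # subtract row max for numerical stability
    e = np.exp(y)
    return e / e.sum(axis=1, keepdims=True)  # rows sum to 1; near one-hot as tau -> 0

def concrete_kmeans_loss(X, centroids, tau=0.5, rng=None):
    """Relaxed k-means objective: a soft Gumbel-Softmax assignment replaces argmin."""
    # squared Euclidean distances, shape (n_points, k)
    d2 = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(-1)
    # logits = negative distance, so closer centroids get higher assignment weight
    assign = gumbel_softmax(-d2, tau=tau, rng=rng)
    # expected within-cluster squared distance under the relaxed assignment
    return (assign * d2).sum(axis=1).mean()
```

Because every operation here is differentiable in the centroids (and, in the full model, in the features produced by the encoder), this loss can be minimised end-to-end by gradient descent, recovering the canonical k-means objective as the temperature is annealed toward zero.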


Related research

06/26/2018  Deep k-Means: Jointly Clustering with k-Means and Learning Representations
We study in this paper the problem of jointly clustering and learning re...

07/26/2020  Dimensionality Reduction for k-means Clustering
We present a study on how to effectively reduce the dimensions of the k-...

06/05/2023  End-to-end Differentiable Clustering with Associative Memories
Clustering is a widely used unsupervised learning technique involving an...

11/27/2021  Transformed K-means Clustering
In this work we propose a clustering framework based on the paradigm of ...

04/09/2015  Extraction of Protein Sequence Motif Information using PSO K-Means
The main objective of the paper is to find the motif information. The fun...

12/07/2022  On the Global Solution of Soft k-Means
This paper presents an algorithm to solve the Soft k-Means problem globa...

08/22/2018  k-meansNet: When k-means Meets Differentiable Programming
In this paper, we study how to make clustering benefiting from different...
