Improving k-Means Clustering Performance with Disentangled Internal Representations

06/05/2020
by Abien Fred Agarap, et al.

Deep clustering algorithms combine representation learning and clustering by jointly optimizing a clustering loss and a non-clustering loss. In such methods, a deep neural network is used for representation learning together with a clustering network. Instead of following this framework to improve clustering performance, we propose a simpler approach: optimizing the entanglement of the learned latent code representation of an autoencoder. We define entanglement as how close pairs of points from the same class or structure are, relative to pairs of points from different classes or structures. To measure the entanglement of data points, we use the soft nearest neighbor loss, and we expand it by introducing an annealing temperature factor. Using our proposed approach, the test clustering accuracy was 96.2% on the MNIST dataset and 85.6% on the EMNIST Balanced dataset, outperforming our baseline models.
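
As an illustration of the loss described in the abstract, below is a minimal PyTorch sketch of the soft nearest neighbor loss with an annealed temperature. The function names, the inverse-decay schedule in anneal_temperature, and all default values are assumptions made for illustration, not the authors' exact implementation.

import torch

def soft_nearest_neighbor_loss(features, labels, temperature=1.0, eps=1e-9):
    # Soft nearest neighbor loss (Frosst et al., 2019): lower values mean that
    # points sharing a label sit closer to each other, relative to points from
    # other classes, i.e. the representation is less entangled.
    distances = torch.cdist(features, features, p=2).pow(2)      # pairwise squared Euclidean distances
    similarities = torch.exp(-distances / temperature)           # neighborhood kernel
    mask_self = 1.0 - torch.eye(features.size(0), device=features.device)
    similarities = similarities * mask_self                      # exclude each point itself
    same_class = (labels.unsqueeze(0) == labels.unsqueeze(1)).float()
    numerator = (similarities * same_class).sum(dim=1)           # same-class neighbors only
    denominator = similarities.sum(dim=1)                        # all neighbors
    return -torch.log(eps + numerator / (eps + denominator)).mean()

def anneal_temperature(initial_temperature, epoch):
    # Hypothetical inverse-decay schedule: the temperature shrinks as training
    # progresses, sharpening the soft neighborhoods.
    return initial_temperature / (1.0 + epoch)

In the approach the abstract describes, a loss of this form would be added, with a weighting factor, to the autoencoder's reconstruction loss so that latent codes of same-class points are pulled together; k-means is then run on the resulting latent codes (for example with scikit-learn's KMeans.fit_predict).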
