Deep Clustering with Measure Propagation

04/18/2021
by   Minhua Chen, et al.

Deep models have improved the state of the art for both supervised and unsupervised learning. For example, deep embedded clustering (DEC) has greatly improved unsupervised clustering performance by using stacked autoencoders for representation learning. However, one weakness of deep modeling is that the local neighborhood structure in the original space is not necessarily preserved in the latent space. To preserve local geometry, various methods have been proposed in the supervised and semi-supervised learning literature (e.g., spectral clustering and label propagation) using graph Laplacian regularization. In this paper, we combine the strength of deep representation learning with measure propagation (MP), a KL-divergence-based graph regularization method originally used in the semi-supervised scenario. The main assumption of MP is that if two data points are close in the original space, they are likely to belong to the same class, as measured by the KL divergence between their class membership distributions. By adopting the same assumption in the unsupervised learning scenario, we propose our Deep Embedded Clustering Aided by Measure Propagation (DECAMP) model. We evaluate DECAMP on short text clustering tasks. On three public datasets, DECAMP performs competitively with other state-of-the-art baselines, including baselines that use additional data to generate the word embeddings used in the clustering process. As an example, on the Stackoverflow dataset, DECAMP achieved a clustering accuracy of 79%, about 5% higher than the baselines. These results suggest that DECAMP is a very effective method for unsupervised learning.
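The MP assumption above can be sketched as a graph regularizer: for each pair of neighboring points, penalize disagreement between their soft cluster-assignment distributions via a symmetric KL divergence. The sketch below is illustrative only; the function names and toy data are assumptions, not the authors' DECAMP implementation.

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """KL(p || q) for two discrete distributions, clipped for stability."""
    p = np.clip(p, eps, 1.0)
    q = np.clip(q, eps, 1.0)
    return float(np.sum(p * np.log(p / q)))

def mp_penalty(memberships, edges, weights=None):
    """Sum of symmetric KL divergences between class-membership
    distributions of neighboring points, weighted by edge affinity.
    Smaller values mean nearby points agree on their soft assignments."""
    if weights is None:
        weights = [1.0] * len(edges)
    total = 0.0
    for (i, j), w in zip(edges, weights):
        total += w * (kl_divergence(memberships[i], memberships[j])
                      + kl_divergence(memberships[j], memberships[i]))
    return total

# Toy example: points 0 and 1 are neighbors with similar soft
# assignments; point 2 disagrees with its neighbor 1, so the
# edge (1, 2) contributes most of the penalty.
q = np.array([[0.9, 0.1],
              [0.8, 0.2],
              [0.1, 0.9]])
edges = [(0, 1), (1, 2)]
print(mp_penalty(q, edges))
```

In a DECAMP-style objective, a term of this form would be added to the clustering loss so that gradient updates pull the latent representations of original-space neighbors toward the same cluster.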

