Deep Amortized Clustering

by Juho Lee, et al.

We propose deep amortized clustering (DAC), a neural architecture that learns to cluster datasets efficiently using only a few forward passes. DAC implicitly learns what makes a cluster, how to group data points into clusters, and how to infer the number of clusters in a dataset. DAC is meta-learned from labelled training datasets, a process distinct from traditional clustering algorithms, which usually require hand-specified prior knowledge about cluster shapes and structures. We show empirically, on both synthetic and image data, that DAC efficiently and accurately clusters new datasets drawn from the same distribution that generated the training datasets.
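To make the amortized idea concrete, here is a minimal sketch of clustering by repeated forward passes: each pass predicts the membership of one cluster, the predicted points are removed, and the number of passes taken is the cluster count. The trained set network is not reproduced here; `forward_pass_stub` is a hypothetical stand-in that uses a simple distance threshold in place of learned inference, so this illustrates only the control flow, not the paper's architecture.

```python
import numpy as np

def forward_pass_stub(points, anchor, radius=1.0):
    """Stand-in for one trained forward pass: predict which points
    belong to the same cluster as the anchor point. A real amortized
    model would be a learned set network; a fixed distance threshold
    plays that role here purely for illustration."""
    return np.linalg.norm(points - anchor, axis=1) < radius

def amortized_cluster(points, radius=1.0):
    """Extract one cluster per 'forward pass' until every point is
    assigned; the number of passes is the inferred cluster count."""
    labels = np.full(len(points), -1)
    k = 0
    while (labels == -1).any():
        idx = np.flatnonzero(labels == -1)          # unassigned points
        mask = forward_pass_stub(points[idx], points[idx[0]], radius)
        labels[idx[mask]] = k                       # commit one cluster
        k += 1
    return labels, k

# Two well-separated synthetic Gaussian clusters
rng = np.random.default_rng(0)
pts = np.concatenate([rng.normal(0, 0.1, (20, 2)),
                      rng.normal(5, 0.1, (20, 2))])
labels, k = amortized_cluster(pts)
```

Because the clusters are far apart relative to the threshold, the loop terminates after two passes, so the cluster count falls out of the procedure rather than being specified in advance.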




Very Compact Clusters with Structural Regularization via Similarity and Connectivity

Clustering algorithms have significantly improved along with Deep Neural...

From Clustering to Cluster Explanations via Neural Networks

A wealth of algorithms have been developed to extract natural cluster st...

Attentive Clustering Processes

Amortized approaches to clustering have recently received renewed attent...

Multilayer Adjusted Cluster Point Process Model: Application to Microbial Biofilm Image Data Analysis

A common problem in spatial statistics tackles spatial distributions of ...

Meta-Learning to Cluster

Clustering is one of the most fundamental and wide-spread techniques in ...

Merged-GHCIDR: Geometrical Approach to Reduce Image Data

The computational resources required to train a model have been increasi...

Scale Adaptive Clustering of Multiple Structures

We propose the segmentation of noisy datasets into Multiple Inlier Struc...