From Clustering to Cluster Explanations via Neural Networks

06/18/2019
by Jacob Kauffmann, et al.

A wealth of algorithms has been developed to extract natural cluster structure from data. Identifying this structure is desirable but not always sufficient: we may also want to understand why the data points have been assigned to a given cluster. Clustering algorithms do not offer a systematic answer to this simple question. We therefore propose a new framework that can, for the first time, explain cluster assignments in terms of input features in a comprehensive manner. It is based on the novel theoretical insight that clustering models can be rewritten as neural networks, or 'neuralized'. Predictions of the resulting networks can then be quickly and accurately attributed to the input features. Several showcases demonstrate the ability of our method to assess the quality of learned clusters and to extract novel insights from the analyzed data and representations.
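To make the 'neuralization' idea above concrete, here is a minimal sketch (not the authors' implementation). For k-means, the difference of squared distances to two centroids is linear in the input, so a cluster-membership score can be written as a (soft-)min over linear units, i.e. a small two-layer network whose output can then be attributed to input features. The function names, the soft-min temperature `beta`, and the gradient-times-input attribution used here are illustrative assumptions, standing in for the attribution procedure used in the paper.

```python
import numpy as np

# Sketch: "neuralizing" a k-means model (illustrative, not the authors' code).
# For k-means, ||x - mu_k||^2 - ||x - mu_c||^2 = w_{ck}^T x + b_{ck} is linear
# in x, so the evidence that x belongs to cluster c is a min over linear units.

def neuralize_kmeans(centroids, c):
    """Weights/biases of the linear units comparing cluster c to its competitors."""
    mu_c = centroids[c]
    others = [k for k in range(len(centroids)) if k != c]
    W = np.stack([2.0 * (mu_c - centroids[k]) for k in others])   # (K-1, d)
    b = np.array([centroids[k] @ centroids[k] - mu_c @ mu_c
                  for k in others])                               # (K-1,)
    return W, b

def cluster_score(x, W, b, beta=5.0):
    """Soft-min pooling over the linear units; positive => x sits in cluster c."""
    h = W @ x + b
    return -np.log(np.exp(-beta * h).sum()) / beta

def attribute(x, W, b, beta=5.0):
    """Gradient-times-input attribution (a simple stand-in for the paper's
    attribution method): distributes the score onto the input features."""
    h = W @ x + b
    p = np.exp(-beta * h)
    p /= p.sum()
    grad = p @ W          # gradient of the soft-min score w.r.t. x
    return grad * x       # per-feature relevance

# Toy usage: two Gaussian blobs, explain why one point belongs to cluster 0.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(5, 1, (50, 2))])
centroids = np.array([X[:50].mean(axis=0), X[50:].mean(axis=0)])  # stand-in for fitted k-means centroids
W, b = neuralize_kmeans(centroids, c=0)
x = X[3]
print("cluster-0 score:", cluster_score(x, W, b))
print("feature relevances:", attribute(x, W, b))
```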
