Joint Optimization of an Autoencoder for Clustering and Embedding

12/07/2020
by Ahcène Boubekki, et al.

Incorporating k-means-like clustering techniques into (deep) autoencoders is an attractive idea, as the clustering can exploit the similarities learned in the embedding to compute a non-linear grouping of the data at hand. Unfortunately, the resulting contributions are often limited by ad hoc choices, decoupled optimization problems, and other issues. We present a theoretically driven deep clustering approach that does not suffer from these limitations and allows for joint optimization of clustering and embedding. The network, in its simplest form, is derived from a Gaussian mixture model and can be incorporated seamlessly into deep autoencoders for state-of-the-art performance.
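The abstract describes deriving a clustering network from a Gaussian mixture model and training it jointly with a deep autoencoder. The sketch below is not the paper's implementation; it is a minimal PyTorch illustration of the general idea of joint optimization, assuming an isotropic, equal-prior GMM-style soft assignment over learnable centroids in the latent space, a small MLP autoencoder, and a hypothetical loss weight alpha. The layer sizes, cluster count, and the ClusteringAutoencoder/joint_loss names are all assumptions made for this example.

```python
# Minimal sketch (not the authors' code): an autoencoder whose embedding is clustered
# by a GMM-like layer, with encoder, decoder and centroids trained on one joint loss.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ClusteringAutoencoder(nn.Module):
    def __init__(self, input_dim=784, latent_dim=10, n_clusters=10):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(input_dim, 256), nn.ReLU(),
            nn.Linear(256, latent_dim),
        )
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 256), nn.ReLU(),
            nn.Linear(256, input_dim),
        )
        # Cluster centroids in the embedding space, learned jointly with the network.
        self.centroids = nn.Parameter(torch.randn(n_clusters, latent_dim))

    def forward(self, x):
        z = self.encoder(x)
        x_hat = self.decoder(z)
        # Soft assignments from negative squared distances to the centroids,
        # i.e. responsibilities of an isotropic Gaussian mixture with equal priors.
        dists = torch.cdist(z, self.centroids) ** 2
        gamma = F.softmax(-dists, dim=1)
        return x_hat, z, gamma


def joint_loss(x, x_hat, z, gamma, centroids, alpha=0.1):
    # Reconstruction keeps the embedding informative ...
    rec = F.mse_loss(x_hat, x)
    # ... while a k-means-like term pulls embeddings toward their soft centroids.
    cluster = (gamma * torch.cdist(z, centroids) ** 2).sum(dim=1).mean()
    return rec + alpha * cluster


# Usage: one gradient step updates encoder, decoder and centroids together.
model = ClusteringAutoencoder()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x = torch.rand(64, 784)  # dummy batch standing in for real data
x_hat, z, gamma = model(x)
loss = joint_loss(x, x_hat, z, gamma, model.centroids)
opt.zero_grad()
loss.backward()
opt.step()
```

In this toy setup the encoder, decoder, and centroids all receive gradients from a single combined loss, which is the sense in which clustering and embedding are optimized jointly rather than in decoupled stages.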

Related research

12/16/2018
Deep Clustering Based on a Mixture of Autoencoders
In this paper we propose a Deep Autoencoder MIxture Clustering (DAMIC) a...

08/23/2018
On a 'Two Truths' Phenomenon in Spectral Graph Clustering
Clustering is concerned with coherently grouping observations without an...

10/13/2020
Mixed data Deep Gaussian Mixture Model: A clustering model for mixed datasets
Clustering mixed data presents numerous challenges inherent to the very ...

11/08/2016
Deep Unsupervised Clustering with Gaussian Mixture Variational Autoencoders
We study a variant of the variational autoencoder model (VAE) with a Gau...

12/05/2022
Clustering with Neural Network and Index
A new model called Clustering with Neural Network and Index (CNNI) is in...

10/30/2020
Unsupervised Embedding of Hierarchical Structure in Euclidean Space
Deep embedding methods have influenced many areas of unsupervised learni...

11/18/2020
Vector Embeddings with Subvector Permutation Invariance using a Triplet Enhanced Autoencoder
The use of deep neural network (DNN) autoencoders (AEs) has recently exp...
