Mixture-of-Experts Variational Autoencoder for clustering and generating from similarity-based representations

10/17/2019
by   Andreas Kopf, et al.

Clustering high-dimensional data, such as images or biological measurements, is a long-standing problem that has been studied extensively. Recently, deep clustering has gained popularity because the non-linearity of neural networks provides the flexibility needed to capture the peculiarities of complex data. Here we introduce the Mixture-of-Experts Similarity Variational Autoencoder (MoE-Sim-VAE), a novel generative clustering model. The model can learn multi-modal distributions of high-dimensional data and use them to generate realistic data with high efficacy and efficiency. MoE-Sim-VAE is based on a Variational Autoencoder (VAE) whose decoder consists of a Mixture-of-Experts (MoE) architecture. This architecture allows the individual modes of the data to be learned automatically, one per expert. Additionally, we encourage the latent representation of our model to follow a Gaussian mixture distribution and to accurately reflect the similarities between data points. We assess the performance of our model on synthetic data, the MNIST benchmark data set, and a challenging real-world task of defining cell subpopulations from mass cytometry (CyTOF) measurements, across hundreds of datasets. MoE-Sim-VAE exhibits superior clustering performance on all these tasks in comparison to the baselines, and we show that the MoE architecture in the decoder reduces the computational cost of sampling specific data modes with high fidelity.
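To make the architecture described above concrete, the following is a minimal NumPy sketch of a VAE forward pass with a Mixture-of-Experts decoder. All dimensions, weight initializations, and the single-linear-layer encoder/experts are illustrative assumptions, not the paper's actual configuration; the similarity loss and Gaussian-mixture prior are omitted. It shows the two components the abstract highlights: a gating network that softly assigns each latent point to an expert (one expert per data mode), and per-expert decoders whose outputs are mixed by the gates.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions (not from the paper): input, latent, number of experts.
D_IN, D_LAT, N_EXPERTS = 20, 5, 3

# Encoder: a single linear map producing mean and log-variance of q(z|x).
W_enc = rng.normal(size=(D_IN, 2 * D_LAT)) * 0.1

# MoE decoder: one linear expert per mode, plus a gating network over the latent.
W_experts = rng.normal(size=(N_EXPERTS, D_LAT, D_IN)) * 0.1
W_gate = rng.normal(size=(D_LAT, N_EXPERTS)) * 0.1

def softmax(a, axis=-1):
    a = a - a.max(axis=axis, keepdims=True)
    e = np.exp(a)
    return e / e.sum(axis=axis, keepdims=True)

def encode(x):
    h = x @ W_enc
    mu, logvar = h[:, :D_LAT], h[:, D_LAT:]
    # Reparameterization trick: z = mu + sigma * eps, eps ~ N(0, I).
    z = mu + np.exp(0.5 * logvar) * rng.normal(size=mu.shape)
    return z, mu, logvar

def decode(z):
    # Gating network: soft assignment of each latent point to the experts.
    gates = softmax(z @ W_gate)                      # (batch, experts)
    # Each expert produces its own reconstruction of the input.
    outs = np.einsum('bl,eld->bed', z, W_experts)    # (batch, experts, D_IN)
    # Final output: gate-weighted mixture of expert reconstructions.
    return np.einsum('be,bed->bd', gates, outs), gates

x = rng.normal(size=(8, D_IN))
z, mu, logvar = encode(x)
x_hat, gates = decode(z)
print(x_hat.shape, gates.shape)  # (8, 20) (8, 3)
```

Because each expert specializes in one mode, sampling from a specific mode reduces to decoding through that single expert (`z @ W_experts[k]`), which is the efficiency benefit the abstract attributes to the MoE decoder.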


