SeGMA: Semi-Supervised Gaussian Mixture Auto-Encoder

06/21/2019
by   Marek Śmieja, et al.

We propose a semi-supervised generative model, SeGMA, which learns a joint probability distribution of data and their classes and which is implemented in a typical Wasserstein auto-encoder framework. We choose a mixture of Gaussians as the target distribution in latent space, which provides a natural splitting of data into clusters. To connect Gaussian components with the correct classes, we use a small amount of labeled data together with a Gaussian classifier induced by the target distribution. SeGMA is optimized efficiently thanks to the use of the Cramér-Wold distance as a maximum mean discrepancy penalty, which yields a closed-form expression for a mixture of spherical Gaussian components and thus obviates the need for sampling. While SeGMA preserves all properties of its semi-supervised predecessors and achieves generative performance at least as good on standard benchmark data sets, it offers additional features: (a) interpolation between any pair of points in the latent space produces realistic-looking samples; (b) by combining the interpolation property with disentangled class and style variables, SeGMA can perform a continuous style transfer from one class to another; (c) the intensity of class characteristics in a data point can be changed by moving its latent representation away from specific Gaussian components.
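The Gaussian classifier mentioned above assigns a latent code to the class whose Gaussian component gives it the highest posterior probability. A minimal sketch of that idea, assuming spherical components with hypothetical means and mixing weights (the actual SeGMA architecture and training procedure are not shown here):

```python
import numpy as np

def gaussian_classifier(z, means, sigma=1.0, weights=None):
    """Assign each latent code to the Gaussian component (class)
    with the highest posterior probability, assuming spherical
    components N(mu_k, sigma^2 I)."""
    # z: (n, d) latent codes; means: (k, d) component means
    k, _ = means.shape
    if weights is None:
        weights = np.full(k, 1.0 / k)  # uniform mixing weights
    # log posterior up to a constant shared across components:
    # log pi_k - ||z - mu_k||^2 / (2 sigma^2)
    sq_dist = ((z[:, None, :] - means[None, :, :]) ** 2).sum(-1)  # (n, k)
    log_post = np.log(weights)[None, :] - sq_dist / (2 * sigma ** 2)
    return log_post.argmax(axis=1)

# toy usage: two components in a 2-D latent space (illustrative values)
means = np.array([[ 3.0, 0.0],
                  [-3.0, 0.0]])
z = np.array([[ 2.5,  0.2],   # near component 0
              [-2.9, -0.1]])  # near component 1
print(gaussian_classifier(z, means))  # -> [0 1]
```

With spherical components and equal weights this reduces to nearest-mean classification, which is why only a handful of labeled points suffice to anchor components to classes.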


