Sampling From Autoencoders' Latent Space via Quantization And Probability Mass Function Concepts

08/21/2023
by Aymene Mohammed Bouayed, et al.

In this study, we focus on sampling from the latent space of generative models built upon autoencoders so that the reconstructed samples are lifelike images. To do so, we introduce a novel post-training sampling algorithm rooted in the concept of probability mass functions, coupled with a quantization process. Our proposed algorithm establishes a vicinity around each latent vector from the input data and then draws samples from these defined neighborhoods. This strategy ensures that the sampled latent vectors predominantly inhabit high-probability regions, which, in turn, can be effectively decoded into realistic images. A natural point of comparison for our sampling algorithm is sampling based on Gaussian mixture models (GMMs), owing to their inherent capability to represent clusters. Notably, we improve the time complexity from the 𝒪(n × d × k × i) of GMM sampling to a much more streamlined 𝒪(n × d), yielding a substantial runtime speedup. Moreover, our experimental results, measured by the Fréchet inception distance (FID) for image generation, demonstrate the superior performance of our sampling algorithm across a diverse range of models and datasets. On the MNIST benchmark dataset, our approach outperforms GMM sampling with an FID improvement of up to 0.89. When generating images of faces and ocular images, our approach yields FID improvements of 1.69 and 0.87 over GMM sampling on the CelebA and MOBIUS datasets, respectively. Lastly, we show that our method estimates the latent-space distribution more accurately than GMM sampling, as measured by the Wasserstein distance.
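To make the quantization-plus-PMF idea concrete, below is a minimal NumPy sketch of one plausible reading of the abstract: encoded latent vectors are quantized onto a per-dimension uniform grid, the empirical counts over occupied cells define a probability mass function, and new latents are drawn by first picking a cell in proportion to its mass and then sampling within that cell's neighborhood. The function names, the grid construction, the `n_bins` parameter, and the uniform within-cell draw are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np

def fit_pmf_sampler(latents, n_bins=32):
    """Quantize latent vectors onto a uniform grid and estimate a PMF
    over the occupied cells (illustrative sketch, not the paper's code).

    latents: (n, d) array of latent vectors produced by the encoder.
    n_bins:  hypothetical number of quantization levels per dimension.
    """
    lo, hi = latents.min(axis=0), latents.max(axis=0)
    step = (hi - lo) / n_bins
    step[step == 0] = 1.0  # guard against degenerate (constant) dimensions
    # Map each latent vector to the index of its quantized cell.
    cells = np.clip(np.floor((latents - lo) / step), 0, n_bins - 1).astype(np.int64)
    # Empirical PMF: relative frequency of each occupied cell.
    uniq, counts = np.unique(cells, axis=0, return_counts=True)
    pmf = counts / counts.sum()
    return uniq, pmf, lo, step

def sample_latents(uniq, pmf, lo, step, n_samples, rng=None):
    """Pick a cell according to the PMF, then draw uniformly inside its
    neighborhood. Each draw costs O(d), so n samples cost O(n * d);
    there is no per-component loop or EM iteration as in GMM sampling
    (presumably the k and i factors in the abstract's complexity)."""
    rng = np.random.default_rng(rng)
    idx = rng.choice(len(uniq), size=n_samples, p=pmf)
    corners = lo + uniq[idx] * step  # lower corner of each chosen cell
    return corners + rng.uniform(size=(n_samples, uniq.shape[1])) * step

# Usage sketch: z = encoder(x); z_new = sample_latents(...); x_new = decoder(z_new)
latents = np.random.randn(1000, 16)  # stand-in for encoded training data
uniq, pmf, lo, step = fit_pmf_sampler(latents)
z_new = sample_latents(uniq, pmf, lo, step, n_samples=64, rng=0)
print(z_new.shape)  # (64, 16)
```

Because sampling reduces to a categorical draw plus one uniform perturbation per dimension, samples stay confined to regions actually occupied by encoded data, which is consistent with the abstract's claim that sampled latents inhabit high-probability regions.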


