Learning Nonparametric High-Dimensional Generative Models: The Empirical-Beta-Copula Autoencoder

09/18/2023
by Maximilian Coblenz, et al.

Any autoencoder can be turned into a generative model by sampling from its latent space and decoding the samples back to the original data space. For this to work, the autoencoder's latent space must be modeled with a distribution from which samples can be drawn. Several simple choices (kernel density estimates, Gaussian distributions) and more sophisticated ones (Gaussian mixture models, copula models, normalizing flows) are conceivable and have been tried recently. This study discusses, assesses, and compares various techniques for capturing the latent space so that an autoencoder becomes a generative model, while striving for simplicity. Among them, a new copula-based method, the Empirical Beta Copula Autoencoder, is considered. Furthermore, we provide insights into further aspects of these methods, such as targeted sampling or synthesizing new data with specific features.
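The pipeline described in the abstract can be sketched in a few lines. The snippet below is a minimal illustration, not the paper's implementation: it fits a Gaussian mixture model (one of the latent-space models mentioned above) to the latent codes of an already-trained autoencoder and then decodes samples drawn from that fitted density. The `encoder`, `decoder`, and `data_loader` objects are hypothetical placeholders for a trained autoencoder and its training data.

```python
# Minimal sketch: turn a trained autoencoder into a generative model by
# fitting a density to its latent codes and decoding samples from it.
import numpy as np
import torch
from sklearn.mixture import GaussianMixture


@torch.no_grad()
def collect_latents(encoder, data_loader):
    """Encode the training data into latent vectors (one row per sample)."""
    zs = []
    for batch in data_loader:          # assumes each batch is an input tensor
        zs.append(encoder(batch).cpu().numpy())
    return np.concatenate(zs, axis=0)


def fit_latent_density(latents, n_components=10, seed=0):
    """Fit a simple latent-space model; a GMM here, but a kernel density
    estimate, copula model, or normalizing flow could be swapped in."""
    gmm = GaussianMixture(n_components=n_components,
                          covariance_type="full",
                          random_state=seed)
    gmm.fit(latents)
    return gmm


@torch.no_grad()
def generate(decoder, latent_model, n_samples=16, device="cpu"):
    """Sample latent codes from the fitted density and decode them."""
    z, _ = latent_model.sample(n_samples)
    z = torch.as_tensor(z, dtype=torch.float32, device=device)
    return decoder(z)
```

Replacing the Gaussian mixture with a kernel density estimate, a copula-based model, or a normalizing flow yields the other latent-space variants compared in the paper.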


Related research:

- Copulas as High-Dimensional Generative Models: Vine Copula Autoencoders (06/12/2019): We propose a vine copula autoencoder to construct flexible generative mo...
- Cramer-Wold AutoEncoder (05/23/2018): We propose a new generative model, Cramer-Wold Autoencoder (CWAE). Follo...
- Rate-Distortion Optimization Guided Autoencoder for Generative Approach with quantitatively measurable latent space (10/10/2019): In the generative model approach of machine learning, it is essential to...
- A Forest from the Trees: Generation through Neighborhoods (02/04/2019): In this work, we propose to learn a generative model using both learned ...
- Classification and Uncertainty Quantification of Corrupted Data using Semi-Supervised Autoencoders (05/27/2021): Parametric and non-parametric classifiers often have to deal with real-w...
- Learning Perceptual Manifold of Fonts (06/17/2021): Along the rapid development of deep learning techniques in generative mo...
- Sampling From Autoencoders' Latent Space via Quantization And Probability Mass Function Concepts (08/21/2023): In this study, we focus on sampling from the latent space of generative ...
