Universal Conditional Machine

06/06/2018
by Oleg Ivanov, et al.

We propose a single neural probabilistic model, based on a variational autoencoder, that can be conditioned on an arbitrary subset of observed features and then sample the remaining features in "one shot". The features may be both real-valued and categorical. The model is trained by stochastic variational Bayes. Experimental evaluation on synthetic data, as well as on feature imputation and image inpainting problems, shows the effectiveness of the proposed approach and the diversity of the generated samples.
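The abstract's conditioning scheme can be illustrated with a minimal NumPy sketch. It assumes one common design for arbitrary conditioning (not taken verbatim from the paper): every network receives the masked features concatenated with the binary mask, so a single model handles any subset of observed features. A Gaussian proposal, a conditional prior, and a decoder combine into a single-sample ELBO; all dimensions and the random untrained weights are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

D, H, Z = 6, 16, 4  # feature, hidden, latent dimensions (illustrative)

def linear(in_dim, out_dim):
    # Random weights stand in for trained parameters.
    return rng.normal(0.0, 0.1, (in_dim, out_dim)), np.zeros(out_dim)

def hidden(layer, x):
    W, b = layer
    return np.tanh(x @ W + b)

# All networks see (x * mask, mask): the mask marks observed features,
# so one model can be conditioned on an arbitrary subset of them.
enc = (linear(2 * D, H), linear(H, 2 * Z))      # proposal q(z | x, mask), sees full x
pri = (linear(2 * D, H), linear(H, 2 * Z))      # prior p(z | observed features, mask)
dec = (linear(Z + 2 * D, H), linear(H, 2 * D))  # decoder -> per-feature mean, log-var

def gauss_params(net, inp):
    h = hidden(net[0], inp)
    out = h @ net[1][0] + net[1][1]      # linear output layer
    mu, logvar = np.split(out, 2, axis=-1)
    return mu, logvar

def elbo(x, mask):
    """Single-sample ELBO for p(x_unobserved | x_observed)."""
    obs = np.concatenate([x * mask, mask])   # what the prior/decoder may see
    full = np.concatenate([x, mask])         # proposal sees full x at training time
    q_mu, q_lv = gauss_params(enc, full)
    p_mu, p_lv = gauss_params(pri, obs)
    z = q_mu + np.exp(0.5 * q_lv) * rng.normal(size=Z)  # reparameterization trick
    d_mu, d_lv = gauss_params(dec, np.concatenate([z, obs]))
    # Reconstruction: Gaussian log-likelihood of the *unobserved* features only.
    ll = -0.5 * ((x - d_mu) ** 2 / np.exp(d_lv) + d_lv + np.log(2 * np.pi))
    rec = np.sum(ll * (1 - mask))
    # KL between proposal and conditional prior (both diagonal Gaussians).
    kl = 0.5 * np.sum(p_lv - q_lv
                      + (np.exp(q_lv) + (q_mu - p_mu) ** 2) / np.exp(p_lv) - 1)
    return rec - kl

x = rng.normal(size=D)
mask = np.array([1, 1, 0, 1, 0, 0], dtype=float)  # 1 = observed feature
print(float(elbo(x, mask)))
```

Training would maximize this ELBO over data with masks sampled per example; at test time, sampling z from the conditional prior and decoding yields all unobserved features in one shot.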



Related research

04/16/2020: Conditioned Variational Autoencoder for top-N item recommendation
In this paper, we propose a Conditioned Variational Autoencoder to impro...

09/13/2019: Flow Models for Arbitrary Conditional Likelihoods
Understanding the dependencies among features of a dataset is at the cor...

06/15/2020: Robust Variational Autoencoder for Tabular Data with Beta Divergence
We propose a robust variational autoencoder with β divergence for tabula...

06/28/2021: Dizygotic Conditional Variational AutoEncoder for Multi-Modal and Partial Modality Absent Few-Shot Learning
Data augmentation is a powerful technique for improving the performance ...

04/29/2023: Analyzing drop coalescence in microfluidic device with a deep learning generative model
Predicting drop coalescence based on process parameters is crucial for e...

06/30/2022: TTS-by-TTS 2: Data-selective augmentation for neural speech synthesis using ranking support vector machine with variational autoencoder
Recent advances in synthetic speech quality have enabled us to train tex...
