Training Deep Normalizing Flow Models in Highly Incomplete Data Scenarios with Prior Regularization

04/03/2021
by   Edgar A. Bernal, et al.

Deep generative frameworks, including GANs and normalizing flow models, have proven successful at filling in missing values in partially observed data samples by effectively learning, either explicitly or implicitly, complex, high-dimensional statistical distributions. In tasks where the data available for learning is only partially observed, however, their performance decays monotonically as a function of the data missingness rate. In high missing data rate regimes (e.g., 60% and above), models tend to break down and produce unrealistic and/or semantically inaccurate data. We propose a novel framework to facilitate the learning of data distributions in high paucity scenarios that is inspired by traditional formulations of solutions to ill-posed problems. The proposed framework naturally stems from posing the process of learning from incomplete data as a joint optimization task over the parameters of the model being learned and the missing data values. The method involves enforcing a prior regularization term that seamlessly integrates with objectives used to train explicit and tractable deep generative frameworks such as deep normalizing flow models. We demonstrate via extensive experimental validation that the proposed framework outperforms competing techniques, particularly as the rate of data paucity approaches unity.
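To make the joint-optimization idea concrete, below is a minimal sketch, not the authors' code: it treats the missing entries as trainable variables alongside the flow parameters and adds a regularization term on the imputed values to the flow's negative log-likelihood. It assumes PyTorch, a toy diagonal affine flow standing in for a deep normalizing flow, and an L2 penalty as a placeholder for the paper's prior term; the variable names, missingness rate, and weight `lam` are illustrative assumptions.

```python
# Sketch: joint optimization of flow parameters and missing-value estimates
# with a prior regularizer added to the flow negative log-likelihood.
import math
import torch

torch.manual_seed(0)
n, d, miss_rate = 512, 4, 0.6

# Synthetic complete data and a random missingness mask (1 = observed).
x_true = torch.randn(n, d) @ torch.randn(d, d) + 2.0
mask = (torch.rand(n, d) > miss_rate).float()
x_obs = x_true * mask                      # unobserved entries zeroed out

# Toy flow: z = (x - b) * exp(-s), so log|det dz/dx| = -sum(s).
s = torch.zeros(d, requires_grad=True)
b = torch.zeros(d, requires_grad=True)

# Missing values treated as free optimization variables (joint learning).
x_miss = torch.zeros(n, d, requires_grad=True)

def flow_log_prob(x):
    z = (x - b) * torch.exp(-s)
    log_det = -s.sum()
    log_base = -0.5 * (z ** 2 + math.log(2 * math.pi)).sum(dim=1)
    return log_base + log_det

lam = 0.1                                  # prior strength (assumed value)
opt = torch.optim.Adam([s, b, x_miss], lr=1e-2)

for step in range(2000):
    opt.zero_grad()
    x_filled = mask * x_obs + (1 - mask) * x_miss
    nll = -flow_log_prob(x_filled).mean()
    # Prior regularization on imputed entries (illustrative L2 stand-in).
    prior = lam * ((1 - mask) * x_miss).pow(2).mean()
    (nll + prior).backward()
    opt.step()

with torch.no_grad():
    x_filled = mask * x_obs + (1 - mask) * x_miss
    print("final NLL per sample:", round(-flow_log_prob(x_filled).mean().item(), 3))
```

The key point of the sketch is that both the flow parameters and the missing entries appear in the same optimizer, which is what the abstract means by a joint optimization task; the prior term then constrains the imputations as the missingness rate grows.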


Related research

02/25/2019
MisGAN: Learning from Incomplete Data with Generative Adversarial Networks
Generative adversarial networks (GANs) have been shown to provide an eff...

03/05/2021
Deep Generative Pattern-Set Mixture Models for Nonignorable Missingness
We propose a variational autoencoder architecture to model both ignorabl...

08/05/2018
Missing Value Imputation Based on Deep Generative Models
Missing values widely exist in many real-world datasets, which hinders t...

01/16/2022
Reconstruction of Incomplete Wildfire Data using Deep Generative Models
We present our submission to the Extreme Value Analysis 2021 Data Challe...

06/23/2020
not-MIWAE: Deep Generative Modelling with Missing not at Random Data
When a missing process depends on the missing values themselves, it need...

09/09/2023
AmbientFlow: Invertible generative models from incomplete, noisy measurements
Generative models have gained popularity for their potential application...

12/01/2021
Learning Invariant Representations with Missing Data
Spurious correlations allow flexible models to predict well during train...