Estimating conditional density of missing values using deep Gaussian mixture model
We consider the problem of estimating the conditional probability distribution of missing values given the observed ones. We propose an approach that combines the flexibility of deep neural networks with the simplicity of Gaussian mixture models (GMMs). Given an incomplete data point, our neural network returns the parameters of a Gaussian distribution (in the form of a Factor Analyzers model) representing the corresponding conditional density. We verify experimentally that our model achieves a better log-likelihood than a conditional GMM trained in the standard way. Moreover, imputations obtained by replacing missing values with the mean vector of our model are visually plausible.
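The abstract's pipeline can be sketched in a few lines. The snippet below is a minimal illustration, not the authors' implementation: the trained neural network is replaced by a hypothetical stand-in (`predict_fa_params`, a fixed random linear map) that, given the observed coordinates, returns the parameters `(mu, A, d)` of a Gaussian in Factor Analyzers form, N(mu, A Aᵀ + diag(d)), over the missing coordinates; imputation then replaces the missing entries with the predicted mean, as described in the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

D = 5  # full dimensionality of a data point
K = 2  # number of factors in the Factor Analyzers covariance


def predict_fa_params(x_obs, n_missing, k=K):
    """Hypothetical stand-in for a trained network: maps the observed part
    of a data point to parameters (mu, A, d) of the conditional density of
    the missing part. Here the map is a fixed random linear transform,
    purely for illustration."""
    W_mu = rng.standard_normal((n_missing, x_obs.size))
    mu = W_mu @ x_obs                         # predicted conditional mean
    A = rng.standard_normal((n_missing, k))   # factor loading matrix
    d = np.full(n_missing, 0.1)               # diagonal noise variances
    return mu, A, d


x = rng.standard_normal(D)
missing = np.array([False, True, False, True, True])  # missingness mask

mu, A, d = predict_fa_params(x[~missing], int(missing.sum()))
cov = A @ A.T + np.diag(d)  # low-rank-plus-diagonal covariance

# Imputation as in the abstract: fill missing entries with the mean vector.
x_imputed = x.copy()
x_imputed[missing] = mu
```

The Factor Analyzers parameterization keeps the covariance positive definite by construction (A Aᵀ is positive semidefinite and diag(d) adds strictly positive noise), while needing only O(nk) parameters per conditional Gaussian instead of O(n²) for a full covariance.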