Missing Features Reconstruction Using a Wasserstein Generative Adversarial Imputation Network

06/21/2020
by Magda Friedjungová, et al.

Missing data is one of the most common preprocessing problems. In this paper, we experimentally research the use of generative and non-generative models for feature reconstruction. Variational Autoencoder with Arbitrary Conditioning (VAEAC) and Generative Adversarial Imputation Network (GAIN) were researched as representatives of generative models, while the denoising autoencoder (DAE) represented non-generative models. The performance of the models is compared to the traditional methods k-nearest neighbors (k-NN) and Multiple Imputation by Chained Equations (MICE). Moreover, we introduce WGAIN, a Wasserstein modification of GAIN, which turns out to be the best imputation model when the degree of missingness is less than or equal to 30%. The models were tested on real-world and artificial datasets with continuous features in which different percentages of features, varying from 10% to 50%, were missing. Evaluation of the imputation algorithms was done by measuring the accuracy of a classification model previously trained on the uncorrupted dataset. The results show that GAIN and especially WGAIN are the best imputers regardless of the conditions. In general, they outperform or are comparable to MICE, k-NN, DAE, and VAEAC.
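The evaluation protocol described above (a classifier is trained on the uncorrupted dataset and its accuracy is then measured on imputed data) can be sketched in a few lines. The sketch below is illustrative only: the dataset, the random forest classifier, the MCAR corruption, and the k-NN imputer with n_neighbors=5 are assumptions standing in for the paper's actual experimental setup; only the overall protocol follows the abstract.

import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.impute import KNNImputer
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# The classifier is fitted on the uncorrupted training data only.
clf = RandomForestClassifier(random_state=0).fit(X_train, y_train)
clean_acc = accuracy_score(y_test, clf.predict(X_test))

for rate in (0.10, 0.30, 0.50):  # degrees of missingness taken from the abstract
    # Remove feature values completely at random (MCAR assumed for illustration).
    X_missing = X_test.copy()
    X_missing[rng.random(X_missing.shape) < rate] = np.nan

    # Impute with k-NN, one of the baseline imputers compared in the paper.
    X_imputed = KNNImputer(n_neighbors=5).fit(X_train).transform(X_missing)
    imputed_acc = accuracy_score(y_test, clf.predict(X_imputed))
    print(f"missingness {rate:.0%}: accuracy {imputed_acc:.3f} "
          f"(clean baseline {clean_acc:.3f})")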

