Data Dieting in GAN Training

04/07/2020
by   Jamal Toutouh, et al.

We investigate training Generative Adversarial Networks (GANs) with less data. Subsets of the training dataset can express empirical sample diversity while reducing training resource requirements, e.g., time and memory. We ask how much data reduction impacts generator performance and gauge the additive value of generator ensembles. In addition to considering stand-alone GAN training and ensembles of generator models, we also consider reduced-data training in an evolutionary GAN training framework named Redux-Lipizzaner. Redux-Lipizzaner makes GAN training more robust and accurate by exploiting overlapping neighborhood-based training on a spatial 2D grid. We conduct empirical experiments on Redux-Lipizzaner using the MNIST and CelebA datasets.
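The overlapping neighborhood-based training on a spatial 2D grid can be illustrated with a short sketch. This is not the authors' code: the von Neumann neighborhood shape, the toroidal (wrap-around) boundaries, and the 3x3 grid size are assumptions chosen for illustration, consistent with spatially distributed GAN training schemes in the Lipizzaner family.

```python
def neighborhood(row, col, rows, cols):
    """Return the grid cells in the von Neumann neighborhood of (row, col),
    with toroidal (wrap-around) boundaries: the cell itself plus its
    north, south, west, and east neighbors.

    Illustrative assumption: neighborhood shape and wrap-around are not
    taken from the paper itself.
    """
    return [
        (row, col),                  # center cell
        ((row - 1) % rows, col),     # north (wraps at the top edge)
        ((row + 1) % rows, col),     # south (wraps at the bottom edge)
        (row, (col - 1) % cols),     # west  (wraps at the left edge)
        (row, (col + 1) % cols),     # east  (wraps at the right edge)
    ]

# Each grid cell would train its generator/discriminator against models
# gathered from its neighborhood; because adjacent neighborhoods overlap,
# well-performing models can propagate across the grid over time.
print(neighborhood(0, 0, 3, 3))
```

Because every cell belongs to several neighborhoods at once, the neighborhoods overlap, which is what lets selection pressure spread models across the grid rather than keeping each cell's population isolated.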


