Stable Parallel Training of Wasserstein Conditional Generative Adversarial Neural Networks

07/25/2022
by Massimiliano Lupo Pasini et al.

We propose a stable, parallel approach to training Wasserstein Conditional Generative Adversarial Neural Networks (W-CGANs) under the constraint of a fixed computational budget. Unlike previous distributed GAN training techniques, our approach avoids inter-process communication, reduces the risk of mode collapse, and enhances scalability by using multiple generators, each concurrently trained on a single data label. The Wasserstein metric further reduces the risk of cycling by stabilizing the training of each generator. We illustrate the approach on CIFAR10, CIFAR100, and ImageNet1k, three standard benchmark image datasets, maintaining the original image resolution for each dataset. Performance is assessed in terms of scalability and final accuracy within a fixed budget of computational time and resources. Accuracy is measured with the inception score, the Fréchet inception distance, and visual image quality. Compared to previous results obtained by applying the parallel approach to deep convolutional conditional generative adversarial neural networks (DC-CGANs), we show an improvement in inception score and Fréchet inception distance, as well as in the quality of the generated images. Weak scaling is attained on these datasets using up to 2,000 NVIDIA V100 GPUs on the OLCF supercomputer Summit.
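The core idea described above, partitioning a labeled dataset by class and training one generator per label with no communication between them, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function names (`partition_by_label`, `train_generator_for_label`) are hypothetical, and the per-label WGAN training loop is replaced by a placeholder.

```python
from collections import defaultdict

def partition_by_label(samples, labels):
    """Group samples by class so that each generator sees one label only."""
    shards = defaultdict(list)
    for x, y in zip(samples, labels):
        shards[y].append(x)
    return dict(shards)

def train_generator_for_label(label, shard):
    # Stand-in for a per-label WGAN training loop: a real version would
    # update one generator/critic pair with the Wasserstein loss using
    # only this label's data.
    return {"label": label, "n_samples": len(shard)}

def train_all_labels(samples, labels):
    shards = partition_by_label(samples, labels)
    # In the parallel setting, each call below would run in its own
    # process on its own GPU; no gradients, parameters, or batches are
    # exchanged between generators, so workers never synchronize.
    return {y: train_generator_for_label(y, s) for y, s in shards.items()}
```

Because the per-label jobs share no state, they map naturally onto independent MPI ranks or GPUs, which is what allows the approach to scale without inter-process communication.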


Related research

02/21/2021 · Scalable Balanced Training of Conditional Generative Adversarial Neural Networks on Image Data
We propose a distributed approach to train deep convolutional generative...

03/06/2019 · DepthwiseGANs: Fast Training Generative Adversarial Networks for Realistic Image Synthesis
Recent work has shown significant progress in the direction of synthetic...

11/20/2020 · Complexity Controlled Generative Adversarial Networks
One of the issues faced in training Generative Adversarial Nets (GANs) a...

10/08/2021 · Evaluating generative networks using Gaussian mixtures of image features
We develop a measure for evaluating the performance of generative networ...

06/11/2021 · ViT-Inception-GAN for Image Colourising
Studies involving colourising images has been garnering researchers' kee...

02/23/2018 · Is Generator Conditioning Causally Related to GAN Performance?
Recent work (Pennington et al., 2017) suggests that controlling the entir...

11/01/2021 · Projected GANs Converge Faster
Generative Adversarial Networks (GANs) produce high-quality images but a...
