On reproduction of "On the regularization of Wasserstein GANs"

12/16/2017
by Junghoon Seo, et al.

This report has several purposes. First, it investigates the reproducibility of the submitted paper On the Regularization of Wasserstein GANs (2018). Second, among the experiments performed in the submitted paper, we emphasize and reproduce five aspects: learning speed, stability, robustness to hyperparameter choices, estimation of the Wasserstein distance, and various sampling methods. Finally, we identify which parts of the contribution can be reproduced, and at what cost in terms of resources. All source code for the reproduction is publicly available.
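The reproduced paper studies how the Lipschitz constraint on the WGAN critic is enforced, comparing the standard two-sided gradient penalty (GP, Gulrajani et al., 2017) with the one-sided Lipschitz penalty (LP) it proposes, which penalizes only gradient norms exceeding 1. As a rough illustration of what such a reproduction implements, below is a minimal PyTorch sketch of both penalty terms under those standard formulations; the function name, the critic interface, and the default lambda are hypothetical, not taken from the released reproduction code.

import torch

def gradient_penalty(critic, real, fake, one_sided=True, lambda_=10.0):
    """Penalty term for WGAN critic training, evaluated at random
    interpolates between real and generated samples.

    one_sided=True  -> LP-style penalty:
                       lambda * E[ max(0, ||grad D(x_hat)|| - 1)^2 ]
    one_sided=False -> GP-style penalty:
                       lambda * E[ (||grad D(x_hat)|| - 1)^2 ]
    """
    # Random interpolation coefficients, broadcast over non-batch dims.
    eps = torch.rand(real.size(0), *([1] * (real.dim() - 1)), device=real.device)
    x_hat = (eps * real + (1 - eps) * fake.detach()).requires_grad_(True)
    d_hat = critic(x_hat)
    # Gradient of the critic output w.r.t. the interpolated inputs.
    grads, = torch.autograd.grad(outputs=d_hat.sum(), inputs=x_hat,
                                 create_graph=True)
    grad_norm = grads.flatten(1).norm(2, dim=1)
    if one_sided:
        deviation = torch.clamp(grad_norm - 1.0, min=0.0)  # penalize only norms > 1
    else:
        deviation = grad_norm - 1.0                        # penalize any deviation from 1
    return lambda_ * deviation.pow(2).mean()

During critic updates, this term would simply be added to the usual Wasserstein loss, e.g. loss = critic(fake).mean() - critic(real).mean() + gradient_penalty(critic, real, fake).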


Related research

03/02/2021 · Wasserstein GANs Work Because They Fail (to Approximate the Wasserstein Distance)
Wasserstein GANs are based on the idea of minimising the Wasserstein dis...

11/29/2019 · Orthogonal Wasserstein GANs
Wasserstein-GANs have been introduced to address the deficiencies of gen...

02/10/2019 · (q,p)-Wasserstein GANs: Comparing Ground Metrics for Wasserstein GANs
Generative Adversarial Networks (GANs) have made a major impact in compute...

11/02/2021 · Understanding Entropic Regularization in GANs
Generative Adversarial Networks are a popular method for learning distri...

06/08/2020 · Distributional Robustness with IPMs and links to Regularization and GANs
Robustness to adversarial attacks is an important concern due to the fra...

07/04/2023 · Wasserstein medians: robustness, PDE characterization and numerics
We investigate the notion of Wasserstein median as an alternative to the...

10/07/2022 · Adversarial network training using higher-order moments in a modified Wasserstein distance
Generative-adversarial networks (GANs) have been used to produce data cl...
