Some Theoretical Insights into Wasserstein GANs

06/04/2020
by Gérard Biau, et al.

Generative Adversarial Networks (GANs) have been successful in producing outstanding results in areas as diverse as image, video, and text generation. Building on these successes, a large number of empirical studies have validated the benefits of the cousin approach called Wasserstein GANs (WGANs), which stabilizes the training process. In the present paper, we add a new stone to the edifice by proposing some theoretical advances in the properties of WGANs. First, we properly define the architecture of WGANs in the context of integral probability metrics parameterized by neural networks and highlight some of their basic mathematical features. We stress in particular interesting optimization properties arising from the use of a parametric 1-Lipschitz discriminator. Then, in a statistically-driven approach, we study the convergence of empirical WGANs as the sample size tends to infinity, and clarify the adversarial effects of the generator and the discriminator by underlining some trade-off properties. These features are finally illustrated with experiments using both synthetic and real-world datasets.
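To make the integral-probability-metric (IPM) view concrete, the sketch below estimates an IPM between two empirical samples over a deliberately tiny discriminator class: 1-D linear functions x ↦ w·x with |w| ≤ 1, which are all 1-Lipschitz. This is an illustrative toy, not the paper's neural-network parameterization; the function `ipm_linear` and the Gaussian samples are assumptions for the demonstration. For this class, the supremum is attained in closed form at w = sign of the mean gap, so the IPM reduces to the absolute difference of sample means (a lower bound on the 1-Wasserstein distance).

```python
import numpy as np

def ipm_linear(real, fake):
    """IPM over the toy discriminator class {x -> w*x : |w| <= 1}:
    sup_{|w|<=1} w * (mean(real) - mean(fake)) = |mean(real) - mean(fake)|.
    Every member of the class is 1-Lipschitz, so this lower-bounds W1."""
    gap = real.mean() - fake.mean()
    w_star = np.sign(gap) if gap != 0 else 1.0  # optimal "critic" weight
    return w_star * gap

# Toy "real" and "generated" samples (illustrative choice of distributions).
rng = np.random.default_rng(0)
real = rng.normal(loc=2.0, scale=1.0, size=10_000)
fake = rng.normal(loc=0.0, scale=1.0, size=10_000)
print(ipm_linear(real, fake))  # close to |2.0 - 0.0| = 2.0
```

With a richer 1-Lipschitz class (e.g. a constrained neural network, as in WGANs), the supremum has no closed form and is approximated by training the discriminator, which is where the optimization and convergence questions studied in the paper arise.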


