Sobolev Wasserstein GAN

12/07/2020
by Minkai Xu, et al.

Wasserstein GANs (WGANs), built upon the Kantorovich-Rubinstein (KR) duality of the Wasserstein distance, are among the most theoretically sound GAN models. In practice, however, they do not always outperform other GAN variants, largely because the Lipschitz condition required by the KR duality is difficult to implement exactly. The community has proposed many different implementations of the Lipschitz constraint, yet none satisfies the restriction perfectly in practice. In this paper, we argue that the strong Lipschitz constraint may be unnecessary for optimization, and we instead take a step back and relax it. Theoretically, we first derive a more general dual form of the Wasserstein distance, called the Sobolev duality, which relaxes the Lipschitz constraint while preserving the favorable gradient property of the Wasserstein distance. Moreover, we show that the KR duality is a special case of the Sobolev duality. Based on the relaxed duality, we further propose a generalized WGAN training scheme named Sobolev Wasserstein GAN (SWGAN), and empirically demonstrate its improvement over existing methods through extensive experiments.
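As background (standard material, not taken from the paper): the KR duality the abstract refers to expresses the Wasserstein-1 distance as W1(P, Q) = sup over 1-Lipschitz f of E_P[f] - E_Q[f], so any particular 1-Lipschitz critic yields a lower bound on W1. A minimal NumPy sketch illustrates this for 1-D empirical distributions, where the primal W1 can be computed exactly by sorting; the specific critic f(x) = -x is an illustrative choice, not anything from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
p = rng.normal(0.0, 1.0, 1000)  # samples from P
q = rng.normal(2.0, 1.0, 1000)  # samples from Q (P shifted by 2)

# Primal: for 1-D empirical measures with equal sample counts,
# W1 is the mean absolute difference of the sorted samples.
w1_primal = np.mean(np.abs(np.sort(p) - np.sort(q)))

# Dual (KR): any 1-Lipschitz f gives a lower bound
# E_P[f] - E_Q[f] <= W1. Here f(x) = -x is 1-Lipschitz and
# (near-)optimal, since Q is P shifted to the right.
f = lambda x: -x
w1_dual = np.mean(f(p)) - np.mean(f(q))

print(w1_primal, w1_dual)  # both close to the true shift, 2.0
```

Both estimates come out near 2.0, and the dual value never exceeds the primal one, as the duality guarantees. A WGAN critic plays the role of f with the Lipschitz constraint enforced only approximately (e.g. via weight clipping or gradient penalties), which is the imperfection the abstract points to.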


07/02/2018

Understanding the Effectiveness of Lipschitz Constraint in Training of GANs via Gradient Analysis

This paper aims to bring a new perspective for understanding GANs, by de...
02/15/2019

Lipschitz Generative Adversarial Nets

In this paper we study the convergence of generative adversarial network...
11/29/2019

Orthogonal Wasserstein GANs

Wasserstein-GANs have been introduced to address the deficiencies of gen...
04/30/2022

A Simple Duality Proof for Wasserstein Distributionally Robust Optimization

We present a short and elementary proof of the duality for Wasserstein d...
04/11/2021

The Many Faces of 1-Lipschitz Neural Networks

Lipschitz constrained models have been used to solve specific deep lear...
11/18/2018

GAN-QP: A Novel GAN Framework without Gradient Vanishing and Lipschitz Constraint

We know SGAN may have a risk of gradient vanishing. A significant improv...
02/26/2021

Moreau-Yosida f-divergences

Variational representations of f-divergences are central to many machine...