Primal-Dual Wasserstein GAN

05/24/2018
by Mevlana Gemici, et al.

We introduce Primal-Dual Wasserstein GAN, a new learning algorithm for building latent variable models of the data distribution based on the primal and the dual formulations of the optimal transport (OT) problem. We utilize the primal formulation to learn a flexible inference mechanism and to create an optimal approximate coupling between the data distribution and the generative model. To learn the generative model, we use the dual formulation and train the decoder adversarially through a critic network that is regularized by the approximate coupling obtained from the primal. Unlike previous methods that violate various properties of the optimal critic, we regularize both the norm and the direction of the gradients of the critic function. Our model shares many of the desirable properties of auto-encoding models in terms of mode coverage and latent structure, while avoiding their undesirable averaging properties, e.g., their inability to capture sharp visual features when modeling real images. We compare our algorithm against several other generative modeling techniques that utilize Wasserstein distances, evaluating with the Fréchet Inception Distance (FID) and Inception Score (IS).
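For reference, the two formulations the abstract refers to are the standard ones for optimal transport: the Kantorovich primal, an infimum over couplings Π(p, q), and, for the Wasserstein-1 distance, the Kantorovich-Rubinstein dual, a supremum over 1-Lipschitz critics:

```latex
% Primal (Kantorovich) formulation: infimum over couplings of p and q
W_c(p, q) = \inf_{\gamma \in \Pi(p, q)} \mathbb{E}_{(x, y) \sim \gamma}\!\left[ c(x, y) \right]

% Dual (Kantorovich-Rubinstein) formulation of W_1: supremum over 1-Lipschitz critics f
W_1(p, q) = \sup_{\lVert f \rVert_L \le 1} \; \mathbb{E}_{x \sim p}\!\left[ f(x) \right] - \mathbb{E}_{y \sim q}\!\left[ f(y) \right]
```

The abstract does not spell out the regularizer itself, so the following is only a minimal sketch of what penalizing both the norm and the direction of the critic's gradients could look like, assuming access to coupled pairs from the learned approximate coupling. All names (critic_gradient_penalty, lambda_reg) are hypothetical, PyTorch is used for illustration, and this should not be read as the authors' implementation:

```python
import torch

def critic_gradient_penalty(critic, x_data, x_gen, lambda_reg=10.0):
    """Penalty on the critic's gradients at interpolates between coupled
    samples: the gradient norm should be 1 and the gradient should point
    from the generated sample toward its coupled data sample.

    x_data, x_gen: coupled batches of shape (B, D).
    """
    eps = torch.rand(x_data.size(0), 1, device=x_data.device)
    x_hat = (eps * x_data + (1.0 - eps) * x_gen).requires_grad_(True)

    # Gradient of the critic at the interpolates (create_graph=True keeps
    # it in the autograd graph so the penalty itself is differentiable).
    grad = torch.autograd.grad(
        outputs=critic(x_hat).sum(),
        inputs=x_hat,
        create_graph=True,
    )[0]
    grad_norm = grad.norm(dim=1)

    # Unit vector from the generated sample toward its coupled data sample.
    direction = x_data - x_gen
    direction = direction / (direction.norm(dim=1, keepdim=True) + 1e-12)

    # Norm term: the optimal W1 critic has unit-norm gradients along the
    # transport rays. Direction term: those gradients are aligned with them.
    norm_term = (grad_norm - 1.0).pow(2)
    cosine = (grad * direction).sum(dim=1) / (grad_norm + 1e-12)
    dir_term = (1.0 - cosine).pow(2)

    return lambda_reg * (norm_term + dir_term).mean()
```

The norm term alone recovers the familiar WGAN-GP penalty; the direction term additionally encourages the critic's gradient at interpolates to align with the transport direction between coupled samples, which is the extra structure the approximate coupling from the primal makes available.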


Related research

05/22/2017
From optimal transport to generative modeling: the VEGAN cookbook
We study unsupervised generative modeling in terms of the optimal transp...

04/05/2018
Sliced-Wasserstein Autoencoder: An Embarrassingly Simple Generative Model
In this paper we study generative modeling via autoencoders while using ...

08/28/2020
Continuous Regularized Wasserstein Barycenters
Wasserstein barycenters provide a geometrically meaningful way to aggreg...

07/14/2021
Differential-Critic GAN: Generating What You Want by a Cue of Preferences
This paper proposes Differential-Critic Generative Adversarial Network (...

10/30/2017
Implicit Manifold Learning on Generative Adversarial Networks
This paper raises an implicit manifold learning perspective in Generativ...

06/18/2018
The Information Autoencoding Family: A Lagrangian Perspective on Latent Variable Generative Models
A variety of learning objectives have been proposed for training latent ...

05/30/2018
Regularized Kernel and Neural Sobolev Descent: Dynamic MMD Transport
We introduce Regularized Kernel and Neural Sobolev Descent for transport...
