Wasserstein-2 Generative Networks

09/28/2019
by Alexander Korotin, et al.

Modern generative learning is mainly associated with Generative Adversarial Networks (GANs). Training such networks is notoriously hard due to the minimax nature of the optimization objective. In this paper we propose a novel algorithm for training generative models that eliminates the minimax GAN objective, thus significantly simplifying model training. The proposed algorithm uses a variational approximation of the Wasserstein-2 distance computed by Input Convex Neural Networks. We also provide the results of computational experiments, which confirm the efficiency of our algorithm in application to latent-space optimal transport and image-to-image style transfer.
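The key building block named in the abstract is the Input Convex Neural Network (ICNN) of Amos et al. (2017): a network whose output is convex in its input, so that its input-gradient parametrizes a candidate optimal transport map T(x) = ∇ψ(x) via Brenier's theorem. As a rough illustration only, below is a minimal PyTorch sketch of an ICNN; the layer sizes, CELU activation, and weight-clamping scheme are illustrative assumptions, not the authors' exact architecture.

import torch
import torch.nn as nn
import torch.nn.functional as F

class ICNN(nn.Module):
    # Input Convex Neural Network (Amos et al., 2017). The output is convex
    # in x because the hidden-to-hidden ("z-path") weights are clamped to be
    # non-negative and the activation is convex and non-decreasing.
    def __init__(self, dim, hidden=64, n_hidden_layers=2):
        super().__init__()
        # Unconstrained skip connections from the input x to every layer.
        self.Wx = nn.ModuleList(
            [nn.Linear(dim, hidden)]
            + [nn.Linear(dim, hidden, bias=False) for _ in range(n_hidden_layers)]
            + [nn.Linear(dim, 1, bias=False)]
        )
        # z-path weights; clamped to >= 0 in forward() to preserve convexity.
        self.Wz = nn.ModuleList(
            [nn.Linear(hidden, hidden) for _ in range(n_hidden_layers)]
            + [nn.Linear(hidden, 1)]
        )

    def forward(self, x):
        z = F.celu(self.Wx[0](x))
        for wx, wz in zip(self.Wx[1:-1], self.Wz[:-1]):
            z = F.celu(F.linear(z, wz.weight.clamp(min=0), wz.bias) + wx(x))
        wz = self.Wz[-1]
        return F.linear(z, wz.weight.clamp(min=0), wz.bias) + self.Wx[-1](x)

# The gradient of the convex potential psi gives a candidate transport map.
psi = ICNN(dim=2)
x = torch.randn(16, 2, requires_grad=True)
(transport,) = torch.autograd.grad(psi(x).sum(), x)  # T(x) = grad psi(x)

This sketch covers only the convex architecture; the abstract's non-minimax training objective for fitting such potentials is described in the paper itself and is not reproduced here.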

Related research

06/24/2019 · Adversarial Computation of Optimal Transport Maps
Computing optimal transport maps between high-dimensional and continuous...

10/16/2019 · Optimal Transport Based Generative Autoencoders
The field of deep generative modeling is dominated by generative adversa...

02/22/2018 · Solving Approximate Wasserstein GANs to Stationarity
Generative Adversarial Networks (GANs) are one of the most practical str...

06/17/2022 · SOS: Score-based Oversampling for Tabular Data
Score-based generative models (SGMs) are a recent breakthrough in genera...

10/06/2021 · Generative Modeling with Optimal Transport Maps
With the discovery of Wasserstein GANs, Optimal Transport (OT) has becom...

02/07/2022 · Algorithms that get old: the case of generative algorithms
Generative AI networks, like the Variational Auto-Encoders (VAE), and Ge...

12/24/2019 · Barycenters of Natural Images – Constrained Wasserstein Barycenters for Image Morphing
Image interpolation, or image morphing, refers to a visual transition be...
