
Wasserstein-2 Generative Networks

09/28/2019
by   Alexander Korotin, et al.

Modern generative learning is mainly associated with Generative Adversarial Networks (GANs). Training such networks is notoriously difficult due to the minimax nature of the optimization objective. In this paper we propose a novel algorithm for training generative models which gets rid of the minimax GAN objective, thus significantly simplifying model training. The proposed algorithm uses a variational approximation of Wasserstein-2 distances by Input Convex Neural Networks. We also provide results of computational experiments, which confirm the efficiency of our algorithm in applications to optimal transport in latent spaces and image-to-image style transfer.
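The key building block mentioned in the abstract is the Input Convex Neural Network (ICNN): a network whose output is convex in its input, obtained by keeping the hidden-to-hidden weights nonnegative and using convex, nondecreasing activations. A minimal NumPy sketch of this construction (layer sizes, initialization, and the class name are illustrative assumptions, not the paper's architecture):

```python
import numpy as np

def relu(x):
    # ReLU is convex and nondecreasing, which preserves convexity
    return np.maximum(x, 0.0)

class ICNN:
    """Minimal input-convex network f(x).

    Convexity in x holds because: (1) the input passes through affine
    maps Wx @ x, (2) hidden-to-hidden weights Wz are nonnegative, and
    (3) activations are convex and nondecreasing.
    """
    def __init__(self, dim, hidden=16, depth=2, seed=0):
        rng = np.random.default_rng(seed)
        self.Wx = [rng.normal(size=(hidden, dim)) * 0.5 for _ in range(depth)]
        self.Wz = [np.abs(rng.normal(size=(hidden, hidden))) * 0.1
                   for _ in range(depth - 1)]          # nonnegative
        self.b = [np.zeros(hidden) for _ in range(depth)]
        self.wout = np.abs(rng.normal(size=hidden))    # nonnegative

    def __call__(self, x):
        z = relu(self.Wx[0] @ x + self.b[0])
        for Wx, Wz, b in zip(self.Wx[1:], self.Wz, self.b[1:]):
            z = relu(Wx @ x + Wz @ z + b)  # Wz >= 0 keeps z convex in x
        return float(self.wout @ z)

# Numerical convexity check along a segment between two points:
f = ICNN(dim=3)
x, y = np.ones(3), -np.ones(3)
for t in np.linspace(0.0, 1.0, 11):
    assert f(t * x + (1 - t) * y) <= t * f(x) + (1 - t) * f(y) + 1e-9
```

In training, such a network parameterizes a convex potential whose gradient gives the optimal transport map; enforcing the nonnegativity of `Wz` (e.g. by clipping after each update) is what keeps the potential convex throughout optimization.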


06/24/2019

Adversarial Computation of Optimal Transport Maps

Computing optimal transport maps between high-dimensional and continuous...
10/16/2019

Optimal Transport Based Generative Autoencoders

The field of deep generative modeling is dominated by generative adversa...
02/22/2018

Solving Approximate Wasserstein GANs to Stationarity

Generative Adversarial Networks (GANs) are one of the most practical str...
06/17/2022

SOS: Score-based Oversampling for Tabular Data

Score-based generative models (SGMs) are a recent breakthrough in genera...
02/10/2019

(q,p)-Wasserstein GANs: Comparing Ground Metrics for Wasserstein GANs

Generative Adversarial Networks (GANs) have made a major impact in compute...
10/06/2021

Generative Modeling with Optimal Transport Maps

With the discovery of Wasserstein GANs, Optimal Transport (OT) has becom...
02/07/2022

Algorithms that get old: the case of generative algorithms

Generative AI networks, like the Variational Auto-Encoders (VAE), and Ge...