Wasserstein-Wasserstein Auto-Encoders

02/25/2019
by Shunkang Zhang, et al.

To address key challenges in learning deep generative models (e.g., the blurriness of variational auto-encoders and the training instability of generative adversarial networks), we propose a novel deep generative model named the Wasserstein-Wasserstein auto-encoder (WWAE). We formulate WWAE as minimization of a penalized optimal transport cost between the target distribution and the generated distribution. Observing that both the prior P_Z and the aggregated posterior Q_Z of the latent code Z can be well approximated by Gaussians, WWAE uses the closed form of the squared Wasserstein-2 distance between two Gaussians in the optimization process. As a result, WWAE avoids the sampling burden and is computationally efficient thanks to the reparameterization trick. Numerical results on several benchmark datasets, including MNIST, Fashion-MNIST and CelebA, show that WWAE learns better latent structures than VAEs and generates samples of better visual quality, with lower FID scores, than VAEs and GANs.
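The closed-form expression the abstract refers to is the standard squared Wasserstein-2 distance between two Gaussians, W2²(N(m1, Σ1), N(m2, Σ2)) = ||m1 − m2||² + Tr(Σ1 + Σ2 − 2(Σ2^{1/2} Σ1 Σ2^{1/2})^{1/2}). A minimal sketch of this formula (not the authors' code; function name and use of SciPy are illustrative choices):

```python
import numpy as np
from scipy.linalg import sqrtm

def gaussian_w2_sq(m1, S1, m2, S2):
    """Squared Wasserstein-2 distance between N(m1, S1) and N(m2, S2).

    W2^2 = ||m1 - m2||^2 + Tr(S1 + S2 - 2 (S2^{1/2} S1 S2^{1/2})^{1/2})
    """
    mean_term = np.sum((np.asarray(m1) - np.asarray(m2)) ** 2)
    S2_half = sqrtm(S2)
    # sqrtm can return a complex array with negligible imaginary part
    # due to floating-point error; keep only the real part.
    cross = np.real(sqrtm(S2_half @ S1 @ S2_half))
    cov_term = np.trace(S1 + S2 - 2.0 * cross)
    return float(mean_term + cov_term)

# Example: N(0, I) vs N((3, 4), 4I) in 2D.
# Mean term: 3^2 + 4^2 = 25; covariance term: Tr(I + 4I - 2*2I) = 2.
print(gaussian_w2_sq(np.zeros(2), np.eye(2),
                     np.array([3.0, 4.0]), 4.0 * np.eye(2)))  # → 27.0
```

For the diagonal Gaussians typically used with the reparameterization trick, the covariance term reduces to the squared difference of the standard-deviation vectors, so the whole distance is cheap to evaluate without sampling.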


