
On Latent Distributions Without Finite Mean in Generative Models

06/05/2018
by Damian Leśniak, et al.

We investigate the properties of multidimensional probability distributions in the context of latent space prior distributions of implicit generative models. Our work revolves around the phenomenon that arises when decoding linear interpolations between two random latent vectors: the interpolations sample regions of latent space close to the origin, causing a distribution mismatch. We show that, due to the Central Limit Theorem, this region is almost never sampled during the training process. As a result, linear interpolations may generate unrealistic data, and their use as a tool to check the quality of a trained model is questionable. We propose to use the multidimensional Cauchy distribution as the latent prior. The Cauchy distribution does not satisfy the assumptions of the CLT and has a number of properties that allow it to work well in conjunction with linear interpolations. We also provide two general methods of creating non-linear interpolations that are easily applicable to a large family of common latent distributions. Finally, we empirically analyze the quality of data generated from low-probability-mass regions for the DCGAN model on the CelebA dataset.
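The core argument lends itself to a quick numerical check. The sketch below is not the authors' code; it assumes NumPy, an illustrative latent dimensionality d = 100, a sample size n = 10,000, and a product of independent standard Cauchy coordinates as a stand-in for the multidimensional Cauchy prior. It compares the norms of Gaussian latent samples with the norms of their linear-interpolation midpoints (which concentrate around sqrt(d) and sqrt(d/2), respectively, so the midpoints land in a region rarely seen during training), and then repeats the comparison for the Cauchy prior, where the average of two independent samples follows the same distribution as the samples themselves.

    # Minimal sketch (not the authors' code), assuming NumPy.
    # Illustrates the distribution mismatch of linear interpolation under a
    # Gaussian prior and its absence under a coordinate-wise Cauchy prior.
    import numpy as np

    rng = np.random.default_rng(0)
    d = 100       # latent dimensionality (illustrative choice)
    n = 10_000    # number of sampled endpoint pairs

    # Standard Gaussian prior: endpoint norms vs. interpolation-midpoint norms.
    z1 = rng.standard_normal((n, d))
    z2 = rng.standard_normal((n, d))
    mid_gauss = 0.5 * (z1 + z2)
    print("Gaussian prior:")
    print("  mean endpoint norm:", np.linalg.norm(z1, axis=1).mean())        # ~ sqrt(d) = 10
    print("  mean midpoint norm:", np.linalg.norm(mid_gauss, axis=1).mean())  # ~ sqrt(d/2) ~ 7.07

    # Coordinate-wise standard Cauchy prior: the average of two independent
    # standard Cauchy variables is again standard Cauchy, so midpoints follow
    # the same distribution as the endpoints. Medians are used because the
    # Cauchy distribution has no finite mean.
    c1 = rng.standard_cauchy((n, d))
    c2 = rng.standard_cauchy((n, d))
    mid_cauchy = 0.5 * (c1 + c2)
    print("Cauchy prior:")
    print("  median endpoint norm:", np.median(np.linalg.norm(c1, axis=1)))
    print("  median midpoint norm:", np.median(np.linalg.norm(mid_cauchy, axis=1)))

Running the sketch shows the Gaussian midpoint norms sitting well below the endpoint norms, while the Cauchy endpoint and midpoint norms roughly agree, which is the behavior the abstract attributes to the heavy-tailed prior.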
