
Latent Space Oddity: on the Curvature of Deep Generative Models

by Georgios Arvanitidis et al.

Deep generative models provide a systematic way to learn nonlinear data distributions through a set of latent variables and a nonlinear "generator" function that maps latent points into the input space. The nonlinearity of the generator implies that the latent space gives a distorted view of the input space. Under mild conditions, we show that this distortion can be characterized by a stochastic Riemannian metric, and we demonstrate that distances and interpolants are significantly improved under this metric. This, in turn, improves probability distributions, sampling algorithms, and clustering in the latent space. Our geometric analysis further reveals that current generators provide poor variance estimates, and we propose a new generator architecture with vastly improved variance estimates. Results are demonstrated on convolutional and fully connected variational autoencoders, but the formalism easily generalizes to other deep generative models.
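The core idea of measuring latent-space distortion can be sketched with a pullback metric: if the generator is a smooth map g with Jacobian J(z), its deterministic pullback metric is G(z) = J(z)ᵀJ(z). The sketch below, a minimal illustration only, uses a hypothetical toy decoder in place of a trained VAE generator; the paper's metric is stochastic and also accounts for the generator's variance network, which this sketch omits.

```python
import torch

# Hypothetical toy "generator": a small decoder mapping 2-D latent
# points to 5-D input space (stands in for a trained VAE decoder mean).
generator = torch.nn.Sequential(
    torch.nn.Linear(2, 16),
    torch.nn.Tanh(),
    torch.nn.Linear(16, 5),
)

def pullback_metric(z):
    """Deterministic pullback metric G(z) = J(z)^T J(z), where J(z)
    is the Jacobian of the generator at latent point z. Curve lengths
    measured with G reflect distances in the input space, not the
    (distorted) latent coordinates."""
    J = torch.autograd.functional.jacobian(generator, z)  # shape (5, 2)
    return J.T @ J  # symmetric positive semi-definite, shape (2, 2)

z = torch.zeros(2)
G = pullback_metric(z)
```

Since G(z) is symmetric positive semi-definite by construction, it defines a (possibly degenerate) Riemannian metric on the latent space; latent distances under G differ from Euclidean ones wherever the generator stretches or compresses space.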

