Pulling back information geometry

06/09/2021
by Georgios Arvanitidis, et al.

Latent space geometry has shown itself to provide a rich and rigorous framework for interacting with the latent variables of deep generative models. The existing theory, however, relies on the decoder being a Gaussian distribution, as its simple reparametrization allows us to interpret the generating process as a random projection of a deterministic manifold. Consequently, this approach breaks down when applied to decoders that are not as easily reparametrized. We here propose to use the Fisher-Rao metric associated with the space of decoder distributions as a reference metric, which we pull back to the latent space. We show that we can achieve meaningful latent geometries for a wide range of decoder distributions for which the previous theory was not applicable, opening the door to "black box" latent geometries.
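To make the pullback construction concrete, the sketch below computes the pulled-back metric G(z) = J(z)^T M(h(z)) J(z), where h is a decoder mapping a latent code to the parameters of the decoder distribution, J is the Jacobian of h, and M is the Fisher information matrix of that distribution family. The toy linear-sigmoid decoder and the independent-Bernoulli likelihood here are illustrative assumptions for a minimal sketch, not the architecture or decoder family studied in the paper.

    import jax
    import jax.numpy as jnp

    # Hypothetical toy decoder: maps a d-dimensional latent code to D Bernoulli means.
    # In practice this would be a trained neural network decoder.
    def decoder(z, W, b):
        return jax.nn.sigmoid(W @ z + b)              # lambda(z) in (0, 1)^D

    def fisher_bernoulli(lam):
        # Fisher information of D independent Bernoullis is diagonal:
        # M_dd = 1 / (lambda_d * (1 - lambda_d)).
        return jnp.diag(1.0 / (lam * (1.0 - lam)))

    def pullback_metric(z, W, b):
        # G(z) = J(z)^T M(lambda(z)) J(z), with J the decoder Jacobian at z.
        lam = decoder(z, W, b)
        J = jax.jacfwd(lambda zz: decoder(zz, W, b))(z)   # shape (D, d)
        return J.T @ fisher_bernoulli(lam) @ J            # shape (d, d)

    key = jax.random.PRNGKey(0)
    d, D = 2, 5
    W = jax.random.normal(key, (D, d))
    b = jnp.zeros(D)
    z = jnp.array([0.3, -0.7])
    print(pullback_metric(z, W, b))

Geodesics computed under such a G(z) would then favor latent paths along which the decoded distributions change slowly in the Fisher-Rao sense, which is the kind of latent geometry the abstract refers to.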


Related research

10/31/2017 - Latent Space Oddity: on the Curvature of Deep Generative Models
Deep generative models provide a systematic way to learn nonlinear data ...

12/20/2022 - Identifying latent distances with Finslerian geometry
Riemannian geometry provides powerful tools to explore the latent space ...

10/22/2020 - Geometry-Aware Hamiltonian Variational Auto-Encoder
Variational auto-encoders (VAEs) have proven to be a well suited tool fo...

06/11/2023 - Happy People – Image Synthesis as Black-Box Optimization Problem in the Discrete Latent Space of Deep Generative Models
In recent years, optimization in the learned latent space of deep genera...

05/31/2022 - Mario Plays on a Manifold: Generating Functional Content in Latent Space through Differential Geometry
Deep generative models can automatically create content of diverse types...

06/05/2018 - On Latent Distributions Without Finite Mean in Generative Models
We investigate the properties of multidimensional probability distributi...

03/28/2019 - Toroidal AutoEncoder
Enforcing distributions of latent variables in neural networks is an act...
