Latent Space Non-Linear Statistics

05/19/2018
by Line Kühnel, et al.

Given data, deep generative models such as variational autoencoders (VAEs) and generative adversarial networks (GANs) train a lower-dimensional latent representation of the data space. The linear Euclidean geometry of the data space pulls back to a nonlinear Riemannian geometry on the latent space. The latent space thus provides a low-dimensional nonlinear representation of the data, and classical linear statistical techniques are no longer applicable. In this paper we show how statistics of data in their latent space representation can be performed using techniques from the field of nonlinear manifold statistics. Nonlinear manifold statistics provide generalizations of Euclidean statistical notions, including means, principal component analysis, and maximum likelihood fits of parametric probability distributions. We develop new techniques for maximum likelihood inference in latent space, and we address the computational complexity of using geometric algorithms with high-dimensional data by training a separate neural network to approximate the Riemannian metric and cometric tensor that capture the shape of the learned data manifold.
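To make the pullback geometry concrete, the following is a minimal sketch, not taken from the paper, of how the Euclidean metric on data space induces a Riemannian metric on the latent space: if g denotes the decoder, the metric at a latent point z is G(z) = J_g(z)^T J_g(z), where J_g is the Jacobian of g, and the cometric is its inverse. The toy decoder and the use of JAX below are illustrative assumptions, not the authors' implementation.

```python
import jax
import jax.numpy as jnp

# Hypothetical toy decoder g: R^2 -> R^3, standing in for a trained VAE/GAN decoder.
def decoder(z):
    return jnp.array([z[0], z[1], jnp.sin(z[0]) * jnp.cos(z[1])])

def pullback_metric(z):
    """Riemannian metric on the latent space induced by the decoder.

    The Euclidean metric on data space pulls back to G(z) = J(z)^T J(z),
    where J(z) is the Jacobian of the decoder at z.
    """
    J = jax.jacfwd(decoder)(z)   # D x d Jacobian of the decoder
    return J.T @ J               # d x d metric tensor

def cometric(z):
    # Inverse metric (cometric), used e.g. in the Hamiltonian form of the geodesic equations.
    return jnp.linalg.inv(pullback_metric(z))

z = jnp.array([0.3, -1.2])
v = jnp.array([1.0, 0.5])
G = pullback_metric(z)
print("metric tensor at z:\n", G)
# Length element of a latent velocity v under the pullback metric: sqrt(v^T G(z) v)
print("length element for v:", jnp.sqrt(v @ G @ v))
```

Quantities such as geodesic distances, Fréchet means, and likelihoods of parametric distributions on the latent space are then defined with respect to this metric; the paper's contribution includes approximating G and its inverse with a separate network to avoid repeated Jacobian evaluations in high dimensions.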


