Expected path length on random manifolds

08/20/2019
by David Eklund, et al.

Manifold learning seeks a low dimensional representation that faithfully captures the essence of data. Current methods can successfully learn such representations, but do not provide a meaningful set of operations that are associated with the representation. Working towards operational representation learning, we endow the latent space of a large class of generative models with a random Riemannian metric, which provides us with elementary operators. As computational tools are unavailable for random Riemannian manifolds, we study deterministic approximations and derive tight error bounds on expected distances.
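The elementary operator underlying such latent-space geometry is curve length under the metric that the generator pulls back from observation space. As a minimal numerical sketch (not the paper's implementation): for a smooth generator g, the pull-back metric is G(z) = J(z)^T J(z) with J the Jacobian of g, and the length of a latent curve is the sum of segment norms under G. The toy generator and the finite-difference Jacobian below are illustrative assumptions.

```python
import numpy as np

def generator(z):
    # Toy "decoder": maps a 2-D latent point to a 3-D observation.
    return np.array([z[0], z[1], np.sin(z[0]) * np.cos(z[1])])

def jacobian(f, z, eps=1e-6):
    # Finite-difference Jacobian of f at z (illustrative; autodiff
    # would be used in practice).
    f0 = f(z)
    J = np.zeros((f0.size, z.size))
    for i in range(z.size):
        dz = np.zeros_like(z)
        dz[i] = eps
        J[:, i] = (f(z + dz) - f0) / eps
    return J

def curve_length(f, z_path):
    # Length of a discretized latent curve under the pull-back metric
    # G(z) = J(z)^T J(z): sum over segments of sqrt(dz^T G(z) dz).
    length = 0.0
    for a, b in zip(z_path[:-1], z_path[1:]):
        mid = 0.5 * (a + b)
        J = jacobian(f, mid)
        dz = b - a
        length += np.sqrt(dz @ (J.T @ J) @ dz)
    return length

# Straight latent line from (0, 0) to (1, 1), 100 segments.
ts = np.linspace(0.0, 1.0, 101)
path = np.stack([ts, ts], axis=1)
print(curve_length(generator, path))
```

When the generator is stochastic, as in the class of models the abstract considers, the Jacobian (and hence the metric) becomes random, which is what motivates the deterministic approximations and error bounds on expected distances.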

Related research

- Identifying latent distances with Finslerian geometry (12/20/2022): Riemannian geometry provides powerful tools to explore the latent space ...
- The Riemannian Geometry of Deep Generative Models (11/21/2017): Deep generative models learn a mapping from a low dimensional latent spa...
- VTAE: Variational Transformer Autoencoder with Manifolds Learning (04/03/2023): Deep generative models have demonstrated successful applications in lear...
- Learning Pose Image Manifolds Using Geometry-Preserving GANs and Elasticae (05/17/2023): This paper investigates the challenge of learning image manifolds, speci...
- Riemannian Score-Based Generative Modeling (02/06/2022): Score-based generative models (SGMs) are a novel class of generative mod...
- Geometrically Enriched Latent Spaces (08/02/2020): A common assumption in generative models is that the generator immerses ...
- Elastic Functional Coding of Riemannian Trajectories (03/07/2016): Visual observations of dynamic phenomena, such as human actions, are oft...
