
Grassmann Manifold Flow

by   Ryoma Yataka, et al.

Recent machine learning research has focused on methods that exploit the symmetry inherent in a specific manifold as an inductive bias. In particular, approaches using Grassmann manifolds have been found to perform well in fields such as point cloud and image set analysis. However, there has been little research on constructing general learning models for distributions on the Grassmann manifold. In this paper, we lay the theoretical foundations for learning distributions on the Grassmann manifold via continuous normalizing flows. Experimental results show that the proposed method generates high-quality samples by capturing the data structure. Furthermore, the proposed method significantly outperformed state-of-the-art methods in terms of log-likelihood or evidence lower bound. The results obtained are expected to spur further research in this field of study.
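The abstract does not give implementation details, but the general recipe behind a continuous normalizing flow on a Grassmann manifold Gr(k, n) can be sketched: represent a point by an n×k matrix with orthonormal columns, project an ambient vector field onto the horizontal tangent space at that point, and retract back onto the manifold after each integration step. The following NumPy sketch illustrates this recipe under those assumptions; the function names and the Euler integrator are illustrative, not the authors' method.

```python
import numpy as np

def project_tangent(Y, X):
    # Project an ambient n-by-k matrix X onto the horizontal tangent
    # space at Y on Gr(k, n): tangent vectors H satisfy Y.T @ H = 0,
    # which gives H = (I - Y Y^T) X.
    return X - Y @ (Y.T @ X)

def retract(Y):
    # Map a perturbed representative back to orthonormal columns via QR.
    Q, R = np.linalg.qr(Y)
    # Fix column signs so the retraction is deterministic.
    return Q * np.sign(np.diag(R))

def flow_step(Y, vector_field, dt):
    # One explicit Euler step of the manifold ODE dY/dt = proj_Y(f(Y)).
    H = project_tangent(Y, vector_field(Y))
    return retract(Y + dt * H)

# Toy usage: flow a random point on Gr(2, 5) along a constant ambient field.
rng = np.random.default_rng(0)
n, k = 5, 2
Y = retract(rng.standard_normal((n, k)))   # random point on Gr(k, n)
C = rng.standard_normal((n, k))            # illustrative ambient field
for _ in range(100):
    Y = flow_step(Y, lambda Y: C, dt=0.01)
```

Because each step ends with a retraction, the iterate stays on the manifold (Y.T @ Y remains the k×k identity), which is the invariant a manifold flow must preserve regardless of the vector field used.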

