Flow-based Generative Models for Learning Manifold to Manifold Mappings

12/18/2020
by   Xingjian Zhen, et al.

Many measurements or observations in computer vision and machine learning manifest as non-Euclidean data. While recent proposals (such as spherical CNNs) have extended a number of deep neural network architectures to manifold-valued data, often with strong gains in performance, the literature on generative models for manifold data remains sparse. Partly because of this gap, there are also no modality transfer/translation models for manifold-valued data, whereas numerous such methods based on generative models are available for natural images. This paper addresses this gap, motivated by a need in brain imaging: we expand the operating range of certain generative models (as well as generative models for modality transfer) from natural images to images with manifold-valued measurements. Our main result is the design of a two-stream version of GLOW (a flow-based invertible generative model) that can synthesize a field of one type of manifold-valued measurements given another. On the theoretical side, we introduce three kinds of invertible layers for manifold-valued data that are not only analogous in functionality to their counterparts in flow-based generative models (e.g., GLOW) but also preserve their key benefit: determinants of the Jacobian are easy to calculate. For experiments, on a large dataset from the Human Connectome Project (HCP), we show promising results in which we reliably and accurately reconstruct brain images of a field of orientation distribution functions (ODF) from diffusion tensor images (DTI), where the latter has a 5× faster acquisition time but at the expense of worse angular resolution.
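The tractable-Jacobian property the abstract highlights is the defining trick of flow-based models like GLOW: each invertible layer is built so that its Jacobian is triangular, making the log-determinant a simple sum. A minimal sketch of the (Euclidean) affine coupling layer that GLOW builds on — not the paper's manifold-valued layers, and with toy linear maps standing in for the learned conditioner networks — looks like this:

```python
import numpy as np

rng = np.random.default_rng(0)

def affine_coupling_forward(x, scale_net, shift_net):
    """RealNVP/GLOW-style affine coupling: split x, transform one half
    conditioned on the other. The Jacobian is triangular, so its
    log-determinant is just the sum of the log-scales."""
    x1, x2 = np.split(x, 2)
    s = scale_net(x1)              # log-scale, conditioned on untouched half
    t = shift_net(x1)              # shift, same conditioning
    y2 = x2 * np.exp(s) + t
    log_det = np.sum(s)            # tractable log|det J|
    return np.concatenate([x1, y2]), log_det

def affine_coupling_inverse(y, scale_net, shift_net):
    """Exact inverse: the untouched half lets us recompute s and t."""
    y1, y2 = np.split(y, 2)
    s = scale_net(y1)
    t = shift_net(y1)
    x2 = (y2 - t) * np.exp(-s)
    return np.concatenate([y1, x2])

# Toy "conditioner networks": fixed random linear maps for illustration.
W_s = rng.normal(size=(2, 2)) * 0.1
W_t = rng.normal(size=(2, 2)) * 0.1
scale_net = lambda h: W_s @ h
shift_net = lambda h: W_t @ h

x = rng.normal(size=4)
y, log_det = affine_coupling_forward(x, scale_net, shift_net)
x_rec = affine_coupling_inverse(y, scale_net, shift_net)
assert np.allclose(x, x_rec)       # invertibility holds exactly
```

The paper's contribution is to design layers with this same structure (exact inverse, cheap log-determinant) when the inputs are manifold-valued fields such as DTI tensors or ODFs, where componentwise affine maps are no longer valid operations.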


