Moser Flow: Divergence-based Generative Modeling on Manifolds

08/18/2021
by Noam Rozen, et al.

We are interested in learning generative models for complex geometries described via manifolds, such as spheres, tori, and other implicit surfaces. Current extensions of existing (Euclidean) generative models are restricted to specific geometries and typically suffer from high computational costs. We introduce Moser Flow (MF), a new class of generative models within the family of continuous normalizing flows (CNF). MF also produces a CNF via a solution to the change-of-variable formula; however, unlike other CNF methods, its model (learned) density is parameterized as the source (prior) density minus the divergence of a neural network (NN). The divergence is a local, linear differential operator that is easy to approximate and calculate on manifolds. Therefore, unlike other CNFs, MF does not require invoking or backpropagating through an ODE solver during training. Furthermore, representing the model density explicitly as the divergence of a NN rather than as a solution of an ODE facilitates learning high-fidelity densities. Theoretically, we prove that MF constitutes a universal density approximator under suitable assumptions. Empirically, we demonstrate for the first time the use of flow models for sampling from general curved surfaces and achieve significant improvements in density estimation, sample quality, and training complexity over existing CNFs on challenging synthetic geometries and real-world benchmarks from the earth and climate sciences.
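As a rough illustration of the density parameterization described in the abstract, the PyTorch sketch below computes the divergence of a small neural vector field with autograd and forms a Moser-Flow-style density mu(x) = nu(x) - div u(x) in the flat (Euclidean) case. The network architecture, the function names, and the simple one-step objective at the end are illustrative assumptions rather than the authors' implementation; in particular, the paper works with divergences on manifolds, which this flat-space sketch does not handle.

```python
import torch
import torch.nn as nn

class VectorField(nn.Module):
    """Illustrative MLP u: R^d -> R^d; its divergence defines the model density."""
    def __init__(self, dim, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim, hidden), nn.Tanh(),
            nn.Linear(hidden, hidden), nn.Tanh(),
            nn.Linear(hidden, dim),
        )

    def forward(self, x):
        return self.net(x)

def divergence(u, x):
    """Exact divergence of u at x (trace of the Jacobian), computed with autograd."""
    x = x.clone().requires_grad_(True)
    y = u(x)
    div = torch.zeros(x.shape[0], device=x.device)
    for i in range(x.shape[1]):
        div = div + torch.autograd.grad(y[:, i].sum(), x, create_graph=True)[0][:, i]
    return div

def model_density(u, x, prior):
    """Moser-Flow-style density mu(x) = nu(x) - div u(x), with nu the prior density."""
    nu = prior.log_prob(x).exp()
    return nu - divergence(u, x)

# Illustrative usage: one training step on 2D data with a standard normal prior.
dim = 2
u = VectorField(dim)
prior = torch.distributions.MultivariateNormal(torch.zeros(dim), torch.eye(dim))
opt = torch.optim.Adam(u.parameters(), lr=1e-3)

x_data = torch.randn(256, dim)   # stand-in for real training samples
x_neg = prior.sample((256,))     # points where negativity of mu is penalized
opt.zero_grad()
mu_data = model_density(u, x_data, prior)
mu_neg = model_density(u, x_neg, prior)
# Hedged stand-in for the paper's objective: maximize log-density at the data
# while pushing the model density to stay non-negative elsewhere.
loss = -(mu_data.clamp_min(1e-6).log()).mean() + torch.relu(-mu_neg).mean()
loss.backward()
opt.step()
```

Note that the only call to the network is its forward pass and its autograd divergence; no ODE solver is invoked, which is the computational point made in the abstract.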

Related research

Matching Normalizing Flows and Probability Paths on Manifolds (07/11/2022)
Continuous Normalizing Flows (CNFs) are a class of generative models tha...

MaCow: Masked Convolutional Generative Flow (02/12/2019)
Flow-based generative models, conceptually attractive due to tractabilit...

Learning Generative Models using Denoising Density Estimators (01/08/2020)
Learning generative probabilistic models that can estimate the continuou...

Riemannian Flow Matching on General Geometries (02/07/2023)
We propose Riemannian Flow Matching (RFM), a simple yet powerful framewo...

Augmented Normalizing Flows: Bridging the Gap Between Generative Flows and Latent Variable Models (02/17/2020)
In this work, we propose a new family of generative flows on an augmente...

CoT: Cooperative Training for Generative Modeling (04/11/2018)
We propose Cooperative Training (CoT) for training generative models tha...

Forward Operator Estimation in Generative Models with Kernel Transfer Operators (12/01/2021)
Generative models which use explicit density modeling (e.g., variational...
