A Tale of Two Latent Flows: Learning Latent Space Normalizing Flow with Short-run Langevin Flow for Approximate Inference

01/23/2023
by Jianwen Xie, et al.

We study a normalizing flow in the latent space of a top-down generator model, in which the normalizing flow serves as an informative prior for the generator. We propose to jointly learn the latent space normalizing flow prior and the top-down generator by a Markov chain Monte Carlo (MCMC)-based maximum likelihood algorithm, in which short-run Langevin sampling from the intractable posterior distribution infers the latent variables for each observed example, so that the parameters of the normalizing flow prior and the generator can be updated with the inferred latent variables. We show that, in the non-convergent short-run MCMC regime, finite-step Langevin dynamics acts as a flow-like approximate inference model, and the learning objective follows a perturbation of maximum likelihood estimation (MLE). We further point out that the learning framework seeks to (i) match the latent space normalizing flow to the aggregated posterior produced by the short-run Langevin flow, and (ii) bias the model away from MLE so that the short-run Langevin flow inference stays close to the true posterior. Extensive empirical results validate the effectiveness of the proposed latent space normalizing flow model on image generation, image reconstruction, anomaly detection, supervised image inpainting, and unsupervised image recovery.
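The short-run Langevin inference described above can be sketched with a toy example. This is a minimal illustration, not the paper's implementation: it assumes a linear generator g(z) = Wz and a standard normal prior standing in for the learned normalizing-flow prior, so the gradient of the log-posterior is available in closed form. Each inference step follows the Langevin update z ← z + (s²/2) ∇_z log p(x, z) + s·ε, run for a fixed, small number of steps from a fixed initialization.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (all names hypothetical): linear generator g(z) = W @ z in place
# of a deep generator, standard normal prior in place of the flow prior.
d_z, d_x, sigma = 2, 5, 0.3
W = rng.normal(size=(d_x, d_z))
z_true = rng.normal(size=d_z)
x = W @ z_true + sigma * rng.normal(size=d_x)  # one observed example

def grad_log_posterior(z):
    # ∇_z log p(x, z) = ∇_z [ -||x - Wz||² / (2σ²) - ||z||² / 2 ]
    return W.T @ (x - W @ z) / sigma**2 - z

def short_run_langevin(z0, n_steps=100, step=0.1):
    """Finite-step Langevin flow: a fixed number of noisy gradient ascent
    steps on the log-posterior, starting from a fixed initialization."""
    z = z0.copy()
    for _ in range(n_steps):
        noise = rng.normal(size=z.shape)
        z = z + 0.5 * step**2 * grad_log_posterior(z) + step * noise
    return z

# Infer the latent variable for the observed x; in the full algorithm this
# inferred z would then drive gradient updates of the prior and generator.
z_inferred = short_run_langevin(np.zeros(d_z))
```

Because the step count is small and fixed, the sampler is non-convergent by design: the composition of the K noisy updates defines a flow-like approximate inference model, which is the view the paper's analysis takes.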


