Variational Mixture of Normalizing Flows

In the past few years, deep generative models, such as generative adversarial networks (GANs), variational autoencoders (VAEs), and their variants, have seen wide adoption for the task of modelling complex data distributions. In spite of the outstanding sample quality achieved by those early methods, they model the target distributions implicitly, in the sense that the probability density functions they induce are not explicitly accessible. This renders those methods unfit for tasks that require, for example, scoring new instances of data under the learned distribution. Normalizing flows overcome this limitation by leveraging the change-of-variables formula for probability density functions and by using transformations designed to have tractable, cheaply computable Jacobians. Although flexible, this framework lacked, until recent work on semi-supervised learning with normalizing flows and on RAD, a way to introduce discrete structure (such as that found in mixtures) into the models it allows one to construct in an unsupervised setting. The present work overcomes this by using normalizing flows as components in a mixture model and devising an end-to-end training procedure for such a model. This procedure is based on variational inference and uses a variational posterior parameterized by a neural network. As will become clear, this model naturally lends itself to (multimodal) density estimation, semi-supervised learning, and clustering. The proposed model is illustrated on two synthetic datasets, as well as on a real-world dataset.

Keywords: deep generative models, normalizing flows, variational inference, probabilistic modelling, mixture models.
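To make the described setup concrete, below is a minimal sketch of a variational mixture of normalizing flows: each mixture component is a small flow with an explicit log-density, the component prior p(z) is a learned categorical, and a neural network parameterizes the variational posterior q(z|x); training maximizes the ELBO end to end. The architecture, layer sizes, and class names are illustrative assumptions, not the authors' exact implementation, and the coupling-layer flow stands in for whichever flow family is actually used.

```python
# Hedged sketch of a variational mixture of normalizing flows (PyTorch).
# All names and hyperparameters are illustrative, not the paper's implementation.
import torch
import torch.nn as nn

class AffineCouplingFlow(nn.Module):
    """A small RealNVP-style flow with an explicit, cheap log-density."""
    def __init__(self, dim=2, n_layers=4, hidden=64):
        super().__init__()
        self.nets = nn.ModuleList([
            nn.Sequential(nn.Linear(dim // 2, hidden), nn.ReLU(),
                          nn.Linear(hidden, dim))  # outputs scale and shift
            for _ in range(n_layers)
        ])

    def log_prob(self, x):
        # Map x to the base distribution, accumulating log|det J| of the inverse map.
        log_det = torch.zeros(x.shape[0], device=x.device)
        z = x
        for i, net in enumerate(self.nets):
            if i % 2:                      # alternate which half is conditioned on
                z = z.flip(dims=[1])
            z1, z2 = z.chunk(2, dim=1)
            s, t = net(z1).chunk(2, dim=1)
            s = torch.tanh(s)              # bounded scales for numerical stability
            z2 = (z2 - t) * torch.exp(-s)
            log_det = log_det - s.sum(dim=1)
            z = torch.cat([z1, z2], dim=1)
        base = torch.distributions.Normal(0.0, 1.0)
        return base.log_prob(z).sum(dim=1) + log_det

class VariationalMixtureOfFlows(nn.Module):
    def __init__(self, dim=2, n_components=3):
        super().__init__()
        self.flows = nn.ModuleList(
            [AffineCouplingFlow(dim) for _ in range(n_components)])
        self.mixture_logits = nn.Parameter(torch.zeros(n_components))   # prior p(z)
        self.posterior_net = nn.Sequential(                             # q(z|x)
            nn.Linear(dim, 64), nn.ReLU(), nn.Linear(64, n_components))

    def elbo(self, x):
        # Categorical variational posterior over mixture components.
        log_q = torch.log_softmax(self.posterior_net(x), dim=1)         # (B, K)
        log_prior = torch.log_softmax(self.mixture_logits, dim=0)       # (K,)
        log_px_z = torch.stack([f.log_prob(x) for f in self.flows], 1)  # (B, K)
        # z is discrete with few values, so the expectation is taken exactly.
        q = log_q.exp()
        return (q * (log_px_z + log_prior - log_q)).sum(dim=1).mean()

# Usage: maximize the ELBO on minibatches; q(z|x) then yields soft cluster assignments.
model = VariationalMixtureOfFlows(dim=2, n_components=3)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x = torch.randn(256, 2)   # placeholder batch; substitute real 2-D data
loss = -model.elbo(x)
loss.backward()
opt.step()
```

Because the latent component index is discrete and the number of components is small, the expectation under q(z|x) can be computed exactly by summing over components, so no sampling or reparameterization of z is needed in this sketch.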


