Resampling Base Distributions of Normalizing Flows

10/29/2021
by Vincent Stimper, et al.

Normalizing flows are a popular class of models for approximating probability distributions. However, their invertible nature limits their ability to model target distributions with a complex topological structure, such as Boltzmann distributions. Several procedures have been proposed to solve this problem, but many of them sacrifice invertibility and, thereby, the tractability of the log-likelihood as well as other desirable properties. To address these limitations, we introduce a base distribution for normalizing flows based on learned rejection sampling, which allows the resulting normalizing flow to model complex topologies without giving up bijectivity. Furthermore, we develop suitable learning algorithms based on both maximum likelihood and optimization of the reverse Kullback-Leibler divergence, and apply them to several example problems, i.e. approximating 2D densities, density estimation of tabular data, image generation, and modeling Boltzmann distributions. In these experiments our method is competitive with or outperforms the baselines.
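As a rough illustration of the idea, the sketch below (not the authors' reference code) shows how a resampled base distribution via learned rejection sampling could look in PyTorch. It assumes a standard-Gaussian proposal, a small acceptance network a_net with sigmoid output, a truncation parameter T, and a running Monte Carlo estimate of the normalizer Z; all of these names and the update rule for Z are illustrative choices.

```python
import math
import torch
import torch.nn as nn


class ResampledGaussianBase(nn.Module):
    """Standard-normal proposal reweighted by a learned acceptance function a(z).

    With truncation after T proposals, the density is
        p(z) = [(1 - alpha) * a(z) / Z + alpha] * N(z; 0, I),
    where Z = E_{N(0,I)}[a(z)] and alpha = (1 - Z)^(T - 1).
    """

    def __init__(self, dim, hidden=64, T=100):
        super().__init__()
        self.dim = dim
        self.T = T
        self.a_net = nn.Sequential(
            nn.Linear(dim, hidden), nn.ReLU(),
            nn.Linear(hidden, 1), nn.Sigmoid(),
        )
        # Running Monte Carlo estimate of the normalizer Z (illustrative choice).
        self.register_buffer("Z", torch.tensor(1.0))

    def _log_gauss(self, z):
        # Log density of the standard-normal proposal.
        return -0.5 * (z ** 2).sum(-1) - 0.5 * self.dim * math.log(2 * math.pi)

    def log_prob(self, z, n_mc=512):
        a = self.a_net(z).squeeze(-1)
        # Refresh the Monte Carlo estimate of Z = E[a(z)] under the proposal;
        # in this sketch Z is treated as a constant, so gradients only flow
        # through a(z).
        with torch.no_grad():
            z_mc = torch.randn(n_mc, self.dim, device=z.device)
            self.Z = 0.9 * self.Z + 0.1 * self.a_net(z_mc).mean()
        alpha = (1.0 - self.Z) ** (self.T - 1)
        return self._log_gauss(z) + torch.log((1.0 - alpha) * a / self.Z + alpha)

    @torch.no_grad()
    def sample(self, num_samples):
        samples = []
        for _ in range(num_samples):
            z = torch.randn(self.dim)
            # Resample up to T - 1 times; the T-th proposal is kept regardless.
            for _ in range(self.T - 1):
                if torch.rand(()).item() < self.a_net(z).item():
                    break
                z = torch.randn(self.dim)
            samples.append(z)
        return torch.stack(samples)
```

Such a base can sit in front of any bijective flow: the flow's change-of-variables log-determinant is added to log_prob, and training can then proceed either by maximum likelihood on data or by minimizing the reverse Kullback-Leibler divergence against an unnormalized target such as a Boltzmann density, as described in the abstract.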


