Dimensionality Reduction Flows

08/05/2019
by Hari Prasanna Das, et al.

Deep generative modelling using flows has gained popularity owing to its tractable exact log-likelihood estimation and efficient training and synthesis. Trained flow models carry rich information about the structure and local variance of the input data. However, a bottleneck preventing flow models from scaling to higher dimensions is that the latent space must have the same size as the high-dimensional input space. In this paper, we propose methods to reduce the latent-space dimension of flow models. Our first approach replaces the standard high-dimensional prior with a prior learned from a low-dimensional noise space. To further achieve exact log-likelihood with reduced dimensionality, our second approach presents an improved multi-scale architecture (Dinh et al., 2016) via likelihood-contribution-based factorization of dimensions. Applying our methods to state-of-the-art flow models, we demonstrate improvements in log-likelihood score on standard image benchmarks. Our work introduces a data-dependent factorization scheme that is more efficient than the static counterparts in prior work.
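The second approach can be pictured as a data-dependent version of the multi-scale split used in RealNVP-style flows: instead of factoring out a fixed half of the dimensions at each level, the dimensions are scored by their contribution to the log-likelihood and the split is chosen accordingly. The sketch below is only a minimal illustration of that idea, assuming a standard Gaussian prior and a simple per-dimension scoring rule; the function name likelihood_split and the keep_ratio parameter are hypothetical and not taken from the paper.

import numpy as np

def gaussian_logpdf(z):
    # Per-dimension log N(z; 0, 1).
    return -0.5 * (z ** 2 + np.log(2.0 * np.pi))

def likelihood_split(z, keep_ratio=0.5):
    # Hypothetical rule: score each latent dimension by its average
    # log-likelihood under the standard Gaussian prior, keep the
    # worst-explained dimensions for deeper flow levels, and factor
    # out the rest at this multi-scale level.
    contrib = gaussian_logpdf(z).mean(axis=0)   # shape: (dim,)
    order = np.argsort(contrib)                 # ascending: least Gaussian first
    n_keep = int(keep_ratio * z.shape[1])
    keep_idx, factor_idx = order[:n_keep], order[n_keep:]
    return z[:, keep_idx], z[:, factor_idx], keep_idx, factor_idx

# Toy usage: an 8-dimensional latent from earlier flow blocks, where some
# dimensions are far from the prior (scale 3) and some are close (scale 0.2).
rng = np.random.default_rng(0)
z = rng.normal(size=(256, 8)) * np.array([1, 1, 3, 3, 1, 1, 0.2, 0.2])
z_keep, z_factor, keep_idx, factor_idx = likelihood_split(z)
print("dims kept for deeper levels:", keep_idx)
print("dims factored out early    :", factor_idx)

In this toy setting, the high-variance dimensions that the prior explains poorly are retained for further transformation, while dimensions that already look Gaussian are factored out early, in contrast to the static half-and-half splits of prior multi-scale architectures.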

