Ordering Dimensions with Nested Dropout Normalizing Flows

06/15/2020
by Artur Bekasov, et al.

The latent space of normalizing flows must have the same dimensionality as their output space. This constraint presents a problem if we want to learn low-dimensional, semantically meaningful representations. Recent work has provided compact representations by fitting flows constrained to manifolds, but has not defined a density off the manifold. In this work we consider flows with full support in data space, but with ordered latent variables. As in PCA, the leading latent dimensions define a sequence of manifolds that lie close to the data. We note a trade-off between the flow likelihood and the quality of the ordering, depending on the parameterization of the flow.
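The ordering mechanism referenced in the title comes from nested dropout: during training, a random truncation index is sampled and all latent dimensions beyond it are zeroed, so earlier dimensions are dropped less often and end up carrying more information. The sketch below illustrates that masking step only; the geometric truncation distribution and the function name are illustrative assumptions, not the paper's exact setup.

```python
import numpy as np

rng = np.random.default_rng(0)

def nested_dropout_mask(dim, p=0.1, rng=rng):
    """Sample a truncation index b and keep only the first b latent
    dimensions, zeroing the rest (the nested dropout masking idea).

    Illustrative sketch: here b is drawn from a geometric
    distribution with parameter p, clipped to the latent dimension.
    """
    b = min(int(rng.geometric(p)), dim)  # truncation index in 1..dim
    mask = np.zeros(dim)
    mask[:b] = 1.0  # keep a contiguous prefix of dimensions
    return mask

# Applying the mask to a latent vector z: leading dimensions survive
# more often, which induces a PCA-like ordering during training.
z = rng.standard_normal(8)
z_masked = z * nested_dropout_mask(8)
```

Because the mask is always a prefix of ones, truncating the latent vector at any index yields a coherent lower-dimensional representation, which is what lets the leading dimensions define a sequence of manifolds close to the data.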

Related research

- Principal Manifold Flows (02/14/2022): Normalizing flows map an independent set of latent variables to their sa...
- Information-Ordered Bottlenecks for Adaptive Semantic Compression (05/18/2023): We present the information-ordered bottleneck (IOB), a neural layer desi...
- Normalizing Flows Across Dimensions (06/23/2020): Real-world data with underlying structure, such as pictures of faces, ar...
- Explorations in Homeomorphic Variational Auto-Encoding (07/12/2018): The manifold hypothesis states that many kinds of high-dimensional data ...
- Learning Ordered Representations with Nested Dropout (02/05/2014): In this paper, we study ordered representations of data in which differe...
- Increasing Expressivity of a Hyperspherical VAE (10/07/2019): Learning suitable latent representations for observed, high-dimensional ...
- An Ode to an ODE (06/19/2020): We present a new paradigm for Neural ODE algorithms, called ODEtoODE, whe...
