General Invertible Transformations for Flow-based Generative Modeling

11/30/2020
by Jakub M. Tomczak, et al.

In this paper, we present a new class of invertible transformations. We show that many well-known invertible transformations in reversible logic and reversible neural networks can be derived from our proposition. Next, we propose two new coupling layers, which are important building blocks of flow-based generative models. In preliminary experiments on toy digit data, we show how these new coupling layers can be used in Integer Discrete Flows (IDF), and that they achieve better results than the standard coupling layers used in IDF and RealNVP.
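For context on the building block the abstract refers to: below is a minimal sketch of the standard additive coupling layer used in IDF (not the paper's proposed generalization). The input is split in two; one half passes through unchanged, and the other is shifted by a rounded translation so that integers map to integers and the map is exactly invertible. Here `t_net` stands in for a hypothetical translation network; any function of the unchanged half would work.

```python
import numpy as np

def coupling_forward(x, t_net):
    """IDF-style additive coupling: y_a = x_a, y_b = x_b + round(t(x_a))."""
    x_a, x_b = np.split(x, 2, axis=-1)                 # split features in half
    y_b = x_b + np.round(t_net(x_a)).astype(x.dtype)   # rounded shift keeps integers
    return np.concatenate([x_a, y_b], axis=-1)

def coupling_inverse(y, t_net):
    """Exact inverse: y_a is unchanged, so the same rounded shift can be subtracted."""
    y_a, y_b = np.split(y, 2, axis=-1)
    x_b = y_b - np.round(t_net(y_a)).astype(y.dtype)
    return np.concatenate([y_a, x_b], axis=-1)

# Round-trip check with a toy translation function (hypothetical stand-in for a network).
t_net = lambda x_a: 0.5 * x_a + 1.3
x = np.array([[3, -1, 4, 2]], dtype=np.int64)
y = coupling_forward(x, t_net)
assert np.array_equal(coupling_inverse(y, t_net), x)
```

Invertibility here relies only on the first half being copied through, which is why the translation network itself needs no special structure; in training, IDF handles the non-differentiable rounding with a straight-through gradient estimator.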

Related research

- Integer Discrete Flows and Lossless Compression (05/17/2019): Lossless compression methods shorten the expected representation size of...
- Equivariant Discrete Normalizing Flows (10/16/2021): At its core, generative modeling seeks to uncover the underlying factors...
- Normalizing Flows with Multi-Scale Autoregressive Priors (04/08/2020): Flow-based generative models are an important class of exact inference m...
- Invertible Generative Modeling using Linear Rational Splines (01/15/2020): Normalizing flows attempt to model an arbitrary probability distribution...
- MintNet: Building Invertible Neural Networks with Masked Convolutions (07/18/2019): We propose a new way of constructing invertible neural networks by combi...
- Coupling-based Invertible Neural Networks Are Universal Diffeomorphism Approximators (06/20/2020): Invertible neural networks based on coupling flows (CF-INNs) have variou...
- ButterflyFlow: Building Invertible Layers with Butterfly Matrices (09/28/2022): Normalizing flows model complex probability distributions using maps obt...
