i-DenseNets

10/05/2020
by Yura Perugachi-Diaz, et al.

We introduce Invertible Dense Networks (i-DenseNets), a more parameter-efficient alternative to Residual Flows. The method relies on an analysis of the Lipschitz continuity of the concatenation in DenseNets, where we enforce the invertibility of the network by satisfying the Lipschitz constraint. Additionally, we extend this method with a learnable concatenation, which not only improves model performance but also indicates the importance of the concatenated representation. We demonstrate the performance of i-DenseNets and Residual Flows on toy, MNIST, and CIFAR10 data. Both variants of i-DenseNets outperform Residual Flows, evaluated in negative log-likelihood, on all considered datasets under an equal parameter budget.
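The invertibility argument above follows the Residual Flow recipe: if a residual block f(x) = x + g(x) has a contractive g (Lipschitz constant below 1, enforced in practice by spectral normalization of the weights), then f is invertible and its inverse can be computed by fixed-point iteration. The sketch below is not the authors' implementation; it is a minimal NumPy illustration of that mechanism with a linear g, and the function names and constants are illustrative assumptions.

```python
import numpy as np

def spectral_normalize(W, coeff=0.9, n_iters=20):
    """Rescale W so its spectral norm is at most `coeff`.

    Uses power iteration to estimate the largest singular value,
    then scales W down only if it exceeds the target coefficient.
    """
    u = np.random.default_rng(0).standard_normal(W.shape[0])
    for _ in range(n_iters):
        v = W.T @ u
        v /= np.linalg.norm(v)
        u = W @ v
        u /= np.linalg.norm(u)
    sigma = u @ W @ v  # estimated spectral norm of W
    return W * min(1.0, coeff / sigma)

def invert_residual(g, y, n_iters=200):
    """Invert f(x) = x + g(x) by the fixed-point iteration x <- y - g(x).

    Converges to the unique preimage whenever g is contractive
    (Lip(g) < 1), which the spectral normalization above guarantees.
    """
    x = y.copy()
    for _ in range(n_iters):
        x = y - g(x)
    return x
```

For example, with `W = spectral_normalize(rng.standard_normal((4, 4)))` and `g = lambda x: W @ x`, applying `invert_residual(g, x + g(x))` recovers `x` to high precision. A dense block replaces this single linear map with a concatenation of inputs and intermediate features, whose overall Lipschitz bound the paper analyzes.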


Related research

Invertible DenseNets with Concatenated LipSwish (02/04/2021)
ELF: Exact-Lipschitz Based Universal Density Approximator Flow (12/13/2021)
Universal Approximation of Residual Flows in Maximum Mean Discrepancy (03/10/2021)
Residual Flows for Invertible Generative Modeling (06/06/2019)
On the expressivity of bi-Lipschitz normalizing flows (07/15/2021)
The Convolution Exponential and Generalized Sylvester Flows (06/02/2020)
Implicit Normalizing Flows (03/17/2021)
