i-DenseNets

10/05/2020
by Yura Perugachi-Diaz, et al.

We introduce Invertible Dense Networks (i-DenseNets), a more parameter-efficient alternative to Residual Flows. The method relies on an analysis of the Lipschitz continuity of the concatenation in DenseNets, where we enforce invertibility of the network by satisfying the Lipschitz constraint. Additionally, we extend this method by proposing a learnable concatenation, which not only improves model performance but also indicates the importance of the concatenated representation. We demonstrate the performance of i-DenseNets and Residual Flows on toy, MNIST, and CIFAR10 data. Both variants of i-DenseNets, with fixed and with learnable concatenation, outperform Residual Flows in negative log-likelihood on all considered datasets under an equal parameter budget.
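The core idea is compact enough to sketch in code. Below is a minimal, hypothetical PyTorch illustration, not the authors' implementation: each dense layer concatenates its input with a spectrally normalized transformation of it; the concatenation weights eta1, eta2 are learnable but normalized so that eta1^2 + eta2^2 = 1, which keeps the concatenation 1-Lipschitz because the Lipschitz constant of [f1; f2] is sqrt(L1^2 + L2^2); and the resulting residual block is inverted by fixed-point iteration, as in Residual Flows. Names such as LipschitzDenseLayer and the simple LipSwish activation are assumptions for illustration; the paper's exact spectral normalization and activations differ in detail.

```python
import torch
import torch.nn as nn
from torch.nn.utils import spectral_norm


class LipSwish(nn.Module):
    """x * sigmoid(x) / 1.1 is 1-Lipschitz (activation used in Residual Flows)."""
    def forward(self, x):
        return x * torch.sigmoid(x) / 1.1


class LipschitzDenseLayer(nn.Module):
    """One dense layer with learnable concatenation (hypothetical sketch).

    Output is concat(eta1 * x, eta2 * g(x)). With Lip(g) <= 1 (spectral norm
    only approximately enforces this bound) and eta1^2 + eta2^2 = 1, the layer
    is 1-Lipschitz, since Lip([f1; f2]) = sqrt(Lip(f1)^2 + Lip(f2)^2).
    """
    def __init__(self, in_dim, growth):
        super().__init__()
        self.g = nn.Sequential(
            spectral_norm(nn.Linear(in_dim, growth)),
            LipSwish(),
        )
        # Unconstrained parameters, normalized on the fly to the unit circle.
        self.eta = nn.Parameter(torch.ones(2))

    def forward(self, x):
        eta = self.eta / self.eta.norm()
        return torch.cat([eta[0] * x, eta[1] * self.g(x)], dim=-1)


class InvertibleDenseBlock(nn.Module):
    """Residual block F(x) = x + c * h(x), invertible when Lip(c * h) < 1."""
    def __init__(self, dim, growth=32, depth=3, c=0.9):
        super().__init__()
        layers, d = [], dim
        for _ in range(depth):
            layers.append(LipschitzDenseLayer(d, growth))
            d += growth  # dense connectivity grows the feature dimension
        # Final (approximately) 1-Lipschitz projection back to the input size.
        layers.append(spectral_norm(nn.Linear(d, dim)))
        self.h = nn.Sequential(*layers)
        self.c = c  # coefficient strictly below 1 keeps the residual contractive

    def forward(self, x):
        return x + self.c * self.h(x)

    @torch.no_grad()
    def inverse(self, y, n_iters=100):
        # Banach fixed-point iteration x <- y - c * h(x) converges because
        # c * h is a contraction; error shrinks geometrically with n_iters.
        x = y.clone()
        for _ in range(n_iters):
            x = y - self.c * self.h(x)
        return x


if __name__ == "__main__":
    block = InvertibleDenseBlock(dim=2)
    x = torch.randn(8, 2)
    y = block(x)
    print((x - block.inverse(y)).abs().max())  # small reconstruction error
```

The learnable eta weights play the role of the paper's learnable concatenation: after training, their relative magnitudes indicate how much the model relies on the skip path versus the transformed representation.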
