FInC Flow: Fast and Invertible k × k Convolutions for Normalizing Flows

01/23/2023
by Aditya Kallappa, et al.

Invertible convolutions have been an essential element of expressive normalizing-flow-based generative models since their introduction in Glow. Several attempts have been made to design invertible k × k convolutions that are efficient in both the training and sampling passes. Although these attempts improved expressivity and sampling efficiency, they lag far behind Glow, which uses only 1 × 1 convolutions, in terms of sampling time. Moreover, many of these approaches mask a large fraction of the parameters of the underlying convolution, reducing expressivity on a fixed run-time budget. We propose a k × k convolutional layer and a deep normalizing-flow architecture that (i) has a fast parallel inversion algorithm with running time O(n k^2), where n is the height and width of the input image and k is the kernel size; (ii) masks the minimal number of learnable parameters in a layer; and (iii) achieves forward-pass and sampling times comparable to or better than other k × k convolution-based models on real-world benchmarks. We provide a GPU implementation of the proposed parallel sampling algorithm for our invertible convolutions. Benchmarks on the CIFAR-10, ImageNet, and CelebA datasets show performance comparable to previous works in bits per dimension while significantly improving sampling time.
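To illustrate the inversion idea behind such masked invertible convolutions, here is a minimal NumPy sketch. It assumes a single-channel input and a kernel masked so that each output pixel depends only on the current pixel and its top-left neighbors, with the (0,0) tap fixed to the identity; pixels on the same anti-diagonal then depend only on earlier diagonals and can be solved in parallel, giving the O(n k^2) sequential depth the abstract refers to. All names are illustrative, not taken from the paper's code.

```python
import numpy as np

def causal_conv(x, w):
    """Forward pass: y[i,j] = x[i,j] + sum_{(a,b) != (0,0)} w[a,b] * x[i-a, j-b].
    The kernel's (0,0) tap is implicitly fixed to 1, so the map is
    lower-triangular in raster order and therefore invertible."""
    n, k = x.shape[0], w.shape[0]
    y = x.copy()
    for i in range(n):
        for j in range(n):
            for a in range(k):
                for b in range(k):
                    if (a, b) != (0, 0) and i - a >= 0 and j - b >= 0:
                        y[i, j] += w[a, b] * x[i - a, j - b]
    return y

def invert_causal_conv(y, w):
    """Inverse pass, swept along anti-diagonals i + j = d.
    Every pixel on diagonal d depends only on pixels from earlier
    diagonals, so all pixels of a diagonal could be solved in parallel;
    with one thread per pixel the sequential work is O(n k^2)."""
    n, k = y.shape[0], w.shape[0]
    x = np.zeros_like(y)
    for d in range(2 * n - 1):                      # anti-diagonal sweep
        for i in range(max(0, d - n + 1), min(n, d + 1)):
            j = d - i
            s = y[i, j]
            for a in range(k):
                for b in range(k):
                    if (a, b) != (0, 0) and i - a >= 0 and j - b >= 0:
                        s -= w[a, b] * x[i - a, j - b]
            x[i, j] = s                             # exact reconstruction
    return x
```

In the sketch the inner anti-diagonal loop is written sequentially for clarity; a GPU implementation would assign one thread per pixel of the current diagonal, which is what turns the naive O(n^2 k^2) raster-order inversion into an O(n k^2)-depth parallel algorithm.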

Related research:

- Generative Flow via Invertible nxn Convolution (05/24/2019)
- Emerging Convolutions for Generative Normalizing Flows (01/30/2019)
- CInC Flow: Characterizable Invertible 3x3 Convolution (07/03/2021)
- Woodbury Transformations for Deep Generative Flows (02/27/2020)
- Hyper-Convolutions via Implicit Kernels for Medical Imaging (02/06/2022)
- MintNet: Building Invertible Neural Networks with Masked Convolutions (07/18/2019)
- NL-CNN: A Resources-Constrained Deep Learning Model based on Nonlinear Convolution (01/30/2021)
