Woodbury Transformations for Deep Generative Flows

02/27/2020
by You Lu, et al.

Normalizing flows are deep generative models that allow efficient likelihood calculation and sampling. The core requirement for these advantages is that they are constructed from functions that can be efficiently inverted and whose Jacobian determinants can be efficiently computed. Researchers have introduced various such flow operations, but few allow rich interactions among variables without incurring significant computational costs. In this paper, we introduce Woodbury transformations, which achieve efficient invertibility via the Woodbury matrix identity and efficient determinant calculation via Sylvester's determinant identity. In contrast with other operations used in state-of-the-art normalizing flows, Woodbury transformations enable (1) high-dimensional interactions, (2) efficient sampling, and (3) efficient likelihood evaluation. Similar operations, such as 1×1 convolutions, emerging convolutions, and periodic convolutions, allow at most two of these three advantages. In our experiments on multiple image datasets, we find that Woodbury transformations allow learning of higher-likelihood models than other flow architectures while still enjoying their efficiency advantages.
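The two identities named in the abstract are standard linear-algebra facts, and a small sketch makes the efficiency argument concrete. The low-rank parameterization below, y = (I + UV)x with factors U (d × r) and V (r × d), along with all variable names and shapes, is an illustrative assumption rather than the paper's exact layer; the point is only how the Woodbury identity reduces inversion to an r × r solve and Sylvester's identity reduces the Jacobian determinant to an r × r determinant when r is much smaller than d.

    # Minimal NumPy sketch of a Woodbury-style invertible linear map,
    # y = (I + U V) x. Shapes and names (U, V, d, r) are illustrative
    # assumptions, not the authors' exact formulation.
    import numpy as np

    rng = np.random.default_rng(0)
    d, r = 64, 4                    # data dimension d, low rank r << d
    U = 0.1 * rng.standard_normal((d, r))
    V = 0.1 * rng.standard_normal((r, d))

    def forward(x):
        """Apply y = (I + U V) x without ever forming the d x d matrix."""
        return x + U @ (V @ x)

    def inverse(y):
        """Woodbury matrix identity (with A = I, C = I):
        (I + U V)^{-1} = I - U (I_r + V U)^{-1} V,
        so only an r x r linear system is solved, not a d x d one."""
        small = np.eye(r) + V @ U               # r x r
        return y - U @ np.linalg.solve(small, V @ y)

    def log_abs_det_jacobian():
        """Sylvester's determinant identity:
        det(I_d + U V) = det(I_r + V U), an r x r determinant."""
        sign, logdet = np.linalg.slogdet(np.eye(r) + V @ U)
        return logdet

    x = rng.standard_normal(d)
    y = forward(x)
    assert np.allclose(inverse(y), x)           # exact inversion
    print("log|det J| =", log_abs_det_jacobian())

For d = 64 and r = 4, this replaces a 64 × 64 inversion and determinant with 4 × 4 ones, which is what allows both sampling (inversion) and likelihood evaluation (log-determinant) to remain cheap in high dimensions.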


Related research

01/30/2019 · Emerging Convolutions for Generative Normalizing Flows
Generative flows are attractive because they admit exact likelihood opti...

07/06/2020 · SurVAE Flows: Surjections to Bridge the Gap between VAEs and Flows
Normalizing flows and variational autoencoders are powerful generative m...

07/03/2021 · CInC Flow: Characterizable Invertible 3x3 Convolution
Normalizing flows are an essential alternative to GANs for generative mo...

06/15/2020 · Why Normalizing Flows Fail to Detect Out-of-Distribution Data
Detecting out-of-distribution (OOD) data is crucial for robust machine l...

01/23/2023 · FInC Flow: Fast and Invertible k × k Convolutions for Normalizing Flows
Invertible convolutions have been an essential element for building expr...

11/14/2020 · Self Normalizing Flows
Efficient gradient computation of the Jacobian determinant term is a cor...

05/19/2023 · Generative Sliced MMD Flows with Riesz Kernels
Maximum mean discrepancy (MMD) flows suffer from high computational cost...
