Quasi-Autoregressive Residual (QuAR) Flows

09/16/2020
by   Achintya Gopal, et al.

Normalizing Flows are a powerful technique for learning and modeling probability distributions given samples from those distributions. Current state-of-the-art results are built upon residual flows, as these can model a larger hypothesis space than coupling layers. However, residual flows are extremely computationally expensive both to train and to use, which limits their applicability in practice. In this paper, we introduce a simplification to residual flows using a Quasi-Autoregressive (QuAR) approach. Compared to the standard residual flow approach, this simplification retains many of the benefits of residual flows while dramatically reducing the compute time and memory requirements, thus making flow-based modeling approaches far more tractable and broadening their potential applicability.
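
The abstract does not spell out the QuAR construction itself, so the following is only a minimal sketch of the invertible residual block that standard residual flows are built on, i.e. the baseline this paper simplifies. It assumes PyTorch and uses spectral normalization to keep the residual branch contractive (Lipschitz constant below 1), so that y = x + g(x) is invertible and x can be recovered by fixed-point iteration; the specific quasi-autoregressive structure of g proposed in the paper is not shown here.

```python
# Sketch of an invertible residual block (the residual-flow baseline),
# not the paper's QuAR layer. Spectral normalization bounds each linear
# layer's Lipschitz constant, making g contractive so the map is invertible.

import torch
import torch.nn as nn
from torch.nn.utils import spectral_norm


class InvertibleResidualBlock(nn.Module):
    def __init__(self, dim: int, hidden: int = 64):
        super().__init__()
        # ELU is 1-Lipschitz; spectral_norm keeps each weight matrix's
        # spectral norm near 1, so g is (approximately) contractive.
        self.g = nn.Sequential(
            spectral_norm(nn.Linear(dim, hidden)),
            nn.ELU(),
            spectral_norm(nn.Linear(hidden, dim)),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Forward map of the flow: y = x + g(x).
        return x + self.g(x)

    def inverse(self, y: torch.Tensor, n_iter: int = 50) -> torch.Tensor:
        # Since g is contractive, x = y - g(x) has a unique fixed point,
        # which simple Banach fixed-point iteration converges to.
        x = y.clone()
        for _ in range(n_iter):
            x = y - self.g(x)
        return x


if __name__ == "__main__":
    block = InvertibleResidualBlock(dim=4)
    x = torch.randn(8, 4)
    y = block(x)
    x_rec = block.inverse(y)
    print(torch.allclose(x, x_rec, atol=1e-5))  # should print True
```

The fixed-point inversion above is one of the costs the abstract refers to: both inversion and the log-determinant estimate of a residual flow require repeated evaluations of g, which is what motivates cheaper, quasi-autoregressive alternatives.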

