Quasi-Autoregressive Residual (QuAR) Flows

by Achintya Gopal, et al.

Normalizing Flows are a powerful technique for learning and modeling probability distributions given samples from those distributions. Current state-of-the-art results are built on residual flows, as these can model a larger hypothesis space than coupling layers. However, residual flows are extremely computationally expensive to both train and use, which limits their applicability in practice. In this paper, we introduce a simplification of residual flows using a Quasi-Autoregressive (QuAR) approach. Compared to the standard residual flow approach, this simplification retains many of the benefits of residual flows while dramatically reducing compute time and memory requirements, making flow-based modeling far more tractable and broadening its potential applicability.
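To make the abstract's setup concrete, the following is a minimal sketch of a residual flow step y = x + g(x), where g is a contraction (Lipschitz constant < 1) so the step is invertible by fixed-point iteration. The toy elementwise g, function names, and constants are illustrative assumptions, not the paper's architecture; the diagonal (hence triangular) Jacobian in the last function only illustrates the kind of cheap exact log-determinant a quasi-autoregressive structure buys, compared with the stochastic estimators full residual flows require.

```python
import numpy as np

def g(x, scale=0.5):
    # Toy residual function: tanh is 1-Lipschitz, so scale < 1 makes g a
    # contraction. (Illustrative stand-in for a constrained neural network.)
    return scale * np.tanh(x)

def forward(x):
    # One residual flow step: y = x + g(x).
    return x + g(x)

def inverse(y, n_iters=50):
    # Invert y = x + g(x) by the fixed-point iteration x <- y - g(x),
    # which converges because g is a contraction (Banach fixed-point theorem).
    x = y.copy()
    for _ in range(n_iters):
        x = y - g(x)
    return x

def log_det_jacobian(x, scale=0.5):
    # For this elementwise g the Jacobian of forward() is diagonal with
    # entries 1 + g'(x_i), so log|det J| is an exact, cheap sum -- the kind
    # of saving a triangular (quasi-autoregressive) Jacobian provides.
    return np.sum(np.log1p(scale * (1.0 - np.tanh(x) ** 2)))

x = np.array([0.3, -1.2, 2.0])
y = forward(x)
x_rec = inverse(y)   # recovers x to high precision
```

The fixed-point inversion is what makes standard residual flows expensive at sampling time; a triangular Jacobian additionally removes the need for unbiased log-determinant estimation during training.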



