Generative Flows with Invertible Attentions

06/07/2021
by Rhea Sanjay Sukthanker et al.

Flow-based generative models have shown an excellent ability to explicitly learn the probability density function of data via a sequence of invertible transformations. Yet, modeling long-range dependencies in normalizing flows remains understudied. To fill this gap, we introduce two types of invertible attention mechanisms for generative flow models: map-based and scaled dot-product attention, for unconditional and conditional generative flow models. The key idea is to exploit split-based attention mechanisms that learn the attention weights and input representations on each pair of splits of the flow feature maps. Our method provides invertible attention modules with tractable Jacobian determinants, enabling seamless integration at any position in a flow-based model. The proposed attention mechanism captures global data dependencies, leading to more expressive flow models. Evaluation on multiple generation tasks demonstrates that the proposed attention flows are efficient and compare favorably against state-of-the-art unconditional and conditional generative flow methods.
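To make the split-based idea concrete, below is a minimal NumPy sketch of one coupling-style invertible attention step. It is a hypothetical illustration, not the authors' exact architecture: scaled dot-product attention is computed only on the first split of the features, and its output parameterizes an affine transform of the second split. Because the first split passes through unchanged, the layer is exactly invertible and the Jacobian log-determinant reduces to a sum of log-scales. All weight names (`Wq`, `Wk`, `Wv`, `Ws`, `Wt`) are assumptions made for the sketch.

```python
import numpy as np

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def attention_features(x1, Wq, Wk, Wv):
    # Scaled dot-product attention computed only on the first split,
    # so the transform applied to the second split stays invertible.
    q, k, v = x1 @ Wq, x1 @ Wk, x1 @ Wv
    a = softmax(q @ k.T / np.sqrt(k.shape[-1]))
    return a @ v

def forward(x, params):
    Wq, Wk, Wv, Ws, Wt = params
    d = x.shape[-1] // 2
    x1, x2 = x[:, :d], x[:, d:]
    h = attention_features(x1, Wq, Wk, Wv)
    log_s = np.tanh(h @ Ws)            # bounded log-scale for numerical stability
    t = h @ Wt
    y2 = x2 * np.exp(log_s) + t        # affine coupling on the second split
    logdet = log_s.sum(axis=-1)        # tractable per-sample Jacobian log-determinant
    return np.concatenate([x1, y2], axis=-1), logdet

def inverse(y, params):
    Wq, Wk, Wv, Ws, Wt = params
    d = y.shape[-1] // 2
    y1, y2 = y[:, :d], y[:, d:]
    h = attention_features(y1, Wq, Wk, Wv)  # y1 == x1, so h is recomputable exactly
    log_s = np.tanh(h @ Ws)
    t = h @ Wt
    x2 = (y2 - t) * np.exp(-log_s)
    return np.concatenate([y1, x2], axis=-1)

rng = np.random.default_rng(0)
d = 4
params = tuple(rng.standard_normal((d, d)) * 0.1 for _ in range(5))
x = rng.standard_normal((3, 2 * d))
y, logdet = forward(x, params)
x_rec = inverse(y, params)
print(np.allclose(x, x_rec))
```

In a full model, such layers would be interleaved with permutations (or split swaps) so that both halves of the features are eventually transformed, mirroring how affine coupling layers are stacked in RealNVP/Glow-style flows.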


Related research

04/08/2020: Normalizing Flows with Multi-Scale Autoregressive Priors
Flow-based generative models are an important class of exact inference m...

09/24/2021: Attentive Contractive Flow: Improved Contractive Flows with Lipschitz-constrained Self-Attention
Normalizing flows provide an elegant method for obtaining tractable dens...

09/16/2018: f-VAEs: Improve VAEs with Conditional Flows
In this paper, we integrate VAEs and flow-based generative models succes...

07/24/2021: Discrete Denoising Flows
Discrete flow-based models are a recently proposed class of generative m...

05/03/2023: Nonparametric Generative Modeling with Conditional and Locally-Connected Sliced-Wasserstein Flows
Sliced-Wasserstein Flow (SWF) is a promising approach to nonparametric g...

09/30/2020: RG-Flow: A hierarchical and explainable flow model based on renormalization group and sparse prior
Flow-based generative models have become an important class of unsupervi...

03/23/2021: Out-of-Distribution Detection of Melanoma using Normalizing Flows
Generative modelling has been a topic at the forefront of machine learni...
