Attentive Contractive Flow: Improved Contractive Flows with Lipschitz-constrained Self-Attention

09/24/2021
by Avideep Mukherjee, et al.

Normalizing flows provide an elegant method for obtaining tractable density estimates from distributions by using invertible transformations. The main challenge is to improve the expressivity of the models while keeping the invertibility constraints intact. We propose to do so by incorporating localized self-attention. However, conventional self-attention mechanisms do not satisfy the requirements for invertible flows and cannot be naively incorporated into normalizing flows. To address this, we introduce a novel approach called Attentive Contractive Flow (ACF), which utilizes a special category of flow-based generative models: contractive flows. We demonstrate that ACF can be introduced into a variety of state-of-the-art flow models in a plug-and-play manner. This not only improves the representation power of these models (improving the bits-per-dim metric), but also results in significantly faster training convergence. Qualitative results, including interpolations between test images, demonstrate that samples are more realistic and capture local correlations in the data well. We evaluate the results further by performing perturbation analysis using additive white Gaussian noise (AWGN), demonstrating that ACF models (especially the dot-product variant) show better and more consistent resilience to additive noise.
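To make the core mechanism concrete, below is a minimal PyTorch sketch, not the authors' released code. It illustrates the idea the abstract relies on: a residual block y = x + g(x) is invertible whenever g is a contraction (Lipschitz constant below 1), and its inverse can be recovered by Banach fixed-point iteration. The attention map here is a toy L2-distance variant with spectrally normalized weights and a conservative output scale; the names LipschitzSelfAttention, ContractiveAttentionBlock, and the coeff parameter are assumptions made for illustration.

import torch
import torch.nn as nn


class LipschitzSelfAttention(nn.Module):
    """Toy L2-distance self-attention with spectrally normalized weights.

    Plain dot-product attention is not globally Lipschitz, so this sketch
    uses an L2 (distance-based) variant with tied query/key weights and a
    conservative output scale; a rigorous Lipschitz bound would need the
    constants from the Lipschitz-attention analysis.
    """

    def __init__(self, dim, coeff=0.5):
        super().__init__()
        self.coeff = coeff  # crude scale to keep the map contractive
        self.qk = nn.utils.spectral_norm(nn.Linear(dim, dim, bias=False))
        self.v = nn.utils.spectral_norm(nn.Linear(dim, dim, bias=False))

    def forward(self, x):                       # x: (batch, tokens, dim)
        q = self.qk(x)                          # tied Q = K (L2 attention)
        attn = torch.softmax(-torch.cdist(q, q), dim=-1)
        return self.coeff * attn @ self.v(x)    # convex combo of values


class ContractiveAttentionBlock(nn.Module):
    """Invertible residual block y = x + g(x) with contractive g."""

    def __init__(self, dim):
        super().__init__()
        self.g = LipschitzSelfAttention(dim)

    def forward(self, x):
        return x + self.g(x)

    def inverse(self, y, n_iters=50):
        # Banach fixed-point iteration x <- y - g(x); converges because
        # g is a contraction.
        x = y.clone()
        for _ in range(n_iters):
            x = y - self.g(x)
        return x


# Sanity check: the fixed-point inverse should recover the input.
torch.manual_seed(0)
block = ContractiveAttentionBlock(dim=8)
x = torch.randn(2, 5, 8)
with torch.no_grad():
    for _ in range(10):   # warm up spectral-norm power iterations
        block(x)
    block.eval()          # freeze the spectral estimates
    err = (block.inverse(block(x)) - x).abs().max()
print(f"max reconstruction error: {err.item():.2e}")

Contractive (residual) flows such as Residual Flows invert blocks with the same fixed-point scheme; the log-determinant term needed for the density is estimated separately (e.g., with an unbiased power-series estimator), which this sketch omits.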


