Attentive Contractive Flow: Improved Contractive Flows with Lipschitz-constrained Self-Attention

09/24/2021
by Avideep Mukherjee, et al.

Normalizing flows provide an elegant method for obtaining tractable density estimates from distributions by using invertible transformations. The main challenge is to improve the expressivity of the models while keeping the invertibility constraints intact. We propose to do so by incorporating localized self-attention. However, conventional self-attention mechanisms do not satisfy the requirements for obtaining invertible flows and cannot be naively incorporated into normalizing flows. To address this, we introduce a novel approach called Attentive Contractive Flow (ACF), which utilizes a special category of flow-based generative models, namely contractive flows. We demonstrate that ACF can be introduced into a variety of state-of-the-art flow models in a plug-and-play manner. This not only improves the representation power of these models (improving the bits-per-dimension metric) but also leads to significantly faster convergence during training. Qualitative results, including interpolations between test images, demonstrate that samples are more realistic and capture local correlations in the data well. We evaluate the results further by performing perturbation analysis with additive white Gaussian noise (AWGN), demonstrating that ACF models (especially the dot-product variant) show better and more consistent resilience to additive noise.
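To make the construction in the abstract concrete, here is a minimal PyTorch sketch, not the authors' reference implementation, of the general recipe: a self-attention residual map g constrained to be contractive, wrapped in an invertible residual block y = x + g(x) that is inverted by fixed-point iteration, as in i-ResNet-style contractive flows. The Lipschitz control used below (spectral normalization of the projection matrices plus a global scale factor) is only a stand-in assumption; plain dot-product attention is not globally Lipschitz in general, which is precisely the problem ACF's Lipschitz-constrained attention addresses. All class and parameter names (LipschitzSelfAttention, scale, n_iters) are illustrative, not taken from the paper.

```python
# Minimal, illustrative sketch (NOT the authors' reference implementation) of the
# general recipe behind ACF: a contractive residual block y = x + g(x), where g is
# a self-attention map whose Lipschitz constant is kept below 1 so the block can be
# inverted by fixed-point iteration, as in i-ResNet-style contractive flows.
# Caveat: spectral normalization plus a global scale, as used here, does not by
# itself give a global Lipschitz bound for dot-product attention; the paper's
# Lipschitz-constrained attention is the actual fix. Names such as
# LipschitzSelfAttention, `scale`, and `n_iters` are assumptions for illustration.

import torch
import torch.nn as nn
from torch.nn.utils.parametrizations import spectral_norm


class LipschitzSelfAttention(nn.Module):
    """Self-attention with spectrally normalized projections, rescaled by
    `scale` < 1 to encourage the residual map to be contractive."""

    def __init__(self, dim: int, scale: float = 0.9):
        super().__init__()
        self.scale = scale
        # Spectral norm bounds each projection's Lipschitz constant by ~1.
        self.query = spectral_norm(nn.Linear(dim, dim))
        self.key = spectral_norm(nn.Linear(dim, dim))
        self.value = spectral_norm(nn.Linear(dim, dim))
        self.proj = spectral_norm(nn.Linear(dim, dim))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, tokens, dim)
        q, k, v = self.query(x), self.key(x), self.value(x)
        attn = torch.softmax(q @ k.transpose(-2, -1) / x.shape[-1] ** 0.5, dim=-1)
        return self.scale * self.proj(attn @ v)


class ContractiveAttentionBlock(nn.Module):
    """Invertible residual block y = x + g(x) with (approximately) contractive g."""

    def __init__(self, dim: int):
        super().__init__()
        self.g = LipschitzSelfAttention(dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x + self.g(x)

    @torch.no_grad()
    def inverse(self, y: torch.Tensor, n_iters: int = 50) -> torch.Tensor:
        # Banach fixed-point iteration x <- y - g(x); converges when Lip(g) < 1.
        x = y.clone()
        for _ in range(n_iters):
            x = y - self.g(x)
        return x


if __name__ == "__main__":
    block = ContractiveAttentionBlock(dim=16)
    x = torch.randn(2, 8, 16)            # (batch, tokens, dim)
    y = block(x)
    x_rec = block.inverse(y)
    # x_rec should closely match x whenever g is in fact contractive.
    print(torch.max(torch.abs(x - x_rec)).item())
```

For maximum-likelihood training, contractive flows of this kind also need log|det(I + J_g)|, which is typically estimated with a truncated or stochastically unbiased power series as in i-ResNet and Residual Flows; that machinery is omitted from the sketch for brevity.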



Related research

Generative Flows with Invertible Attentions (06/07/2021)
Flow-based generative models have shown excellent ability to explicitly ...

The Expressive Power of a Class of Normalizing Flow Models (05/31/2020)
Normalizing flows have received a great deal of recent attention as they...

E(2) Equivariant Self-Attention for Radio Astronomy (11/08/2021)
In this work we introduce group-equivariant self-attention models to add...

Multiscale Self Attentive Convolutions for Vision and Language Modeling (12/03/2019)
Self attention mechanisms have become a key building block in many state...

Blind Deinterleaving of Signals in Time Series with Self-attention Based Soft Min-cost Flow Learning (10/24/2020)
We propose an end-to-end learning approach to address deinterleaving of ...

Distilling the Knowledge from Normalizing Flows (06/24/2021)
Normalizing flows are a powerful class of generative models demonstratin...

Highway Transformer: Self-Gating Enhanced Self-Attentive Networks (04/17/2020)
Self-attention mechanisms have made striking state-of-the-art (SOTA) pro...