Invertible Attention

06/16/2021
by Jiajun Zha, et al.

Attention has proven to be an effective mechanism for capturing long-range dependencies. However, so far it has not been deployed in invertible networks. This is because every component of an invertible network must be a bijective transformation, which a standard attention block is not. In this paper, we propose invertible attention that can be plugged into existing invertible models. We show, both mathematically and experimentally, that the invertibility of an attention model can be achieved by carefully constraining its Lipschitz constant. We validate the invertibility of our invertible attention on the image reconstruction task with three popular datasets: CIFAR-10, SVHN, and CelebA. We also show that our invertible attention achieves performance similar to standard, non-invertible attention on dense prediction tasks.
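The abstract does not spell out the construction, but the core idea, constraining a Lipschitz constant so that a residual attention map becomes bijective, can be illustrated with a short sketch. The PyTorch sketch below is an assumption-laden illustration in the style of i-ResNets, not the authors' code: attention is wrapped as a residual map y = x + g(x), g is pushed toward a contraction via spectral normalization on the Q/K/V projections plus a scaling factor c < 1, and the inverse is recovered by fixed-point iteration. All names here (ContractiveAttention, InvertibleAttentionBlock, c, n_iters) are hypothetical, and note the caveat in the comments: spectral normalization alone does not strictly bound the Lipschitz constant of dot-product softmax attention, so the paper's actual constraint is presumably more careful.

```python
# Illustrative sketch only (assumed names, not the paper's code): an
# attention block made invertible i-ResNet style. If g has Lipschitz
# constant < 1, the residual map y = x + g(x) is bijective and its
# inverse is the limit of the fixed-point iteration x <- y - g(x).
import torch
import torch.nn as nn
from torch.nn.utils import spectral_norm


class ContractiveAttention(nn.Module):
    """Single-head dot-product self-attention, scaled toward a contraction.

    Caveat: spectral normalization of the Q/K/V projections plus the
    factor `c` only approximates a Lipschitz bound here; dot-product
    softmax attention is not globally Lipschitz without further care.
    """

    def __init__(self, dim, c=0.5):
        super().__init__()
        self.c = c  # contraction factor, assumed < 1
        self.q = spectral_norm(nn.Linear(dim, dim, bias=False))
        self.k = spectral_norm(nn.Linear(dim, dim, bias=False))
        self.v = spectral_norm(nn.Linear(dim, dim, bias=False))
        self.scale = dim ** -0.5

    def forward(self, x):  # x: (batch, tokens, dim)
        logits = self.q(x) @ self.k(x).transpose(-2, -1) * self.scale
        return self.c * (torch.softmax(logits, dim=-1) @ self.v(x))


class InvertibleAttentionBlock(nn.Module):
    """Residual block y = x + g(x); bijective whenever Lip(g) < 1."""

    def __init__(self, dim):
        super().__init__()
        self.g = ContractiveAttention(dim)

    def forward(self, x):
        return x + self.g(x)

    @torch.no_grad()
    def inverse(self, y, n_iters=30):
        # Banach fixed-point iteration; error shrinks like Lip(g)^k.
        x = y
        for _ in range(n_iters):
            x = y - self.g(x)
        return x


# Quick sanity check: invert a forward pass and measure reconstruction error.
block = InvertibleAttentionBlock(dim=32)
x = torch.randn(2, 8, 32)
with torch.no_grad():
    for _ in range(10):  # warm up the spectral-norm power iterations
        block(x)
block.eval()  # freeze the spectral-norm estimate so g is fixed during inversion
y = block(x)
print((block.inverse(y) - x).abs().max())  # should be near zero
```

The contraction factor c trades invertibility margin against expressiveness: smaller c makes the fixed-point iteration converge faster but limits how much the attention output can move each token.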

Related research

01/21/2018
Dense Recurrent Neural Networks for Scene Labeling
Recently recurrent neural networks (RNNs) have demonstrated the ability ...

12/14/2021
Explore Long-Range Context feature for Speaker Verification
Capturing long-range dependency and modeling long temporal contexts is p...

03/08/2021
Lipschitz Normalization for Self-Attention Layers with Application to Graph Neural Networks
Attention based neural networks are state of the art in a large range of...

04/03/2018
Bi-Directional Block Self-Attention for Fast and Memory-Efficient Sequence Modeling
Recurrent neural networks (RNN), convolutional neural networks (CNN) and...

10/27/2018
A^2-Nets: Double Attention Networks
Learning to capture long-range relations is fundamental to image/video r...

02/10/2021
On the Regularity of Attention
Attention is a powerful component of modern neural networks across a wid...

11/23/2021
SimpleTron: Eliminating Softmax from Attention Computation
In this paper, we propose that the dot product pairwise matching attenti...
