Linear Self-Attention Approximation via Trainable Feedforward Kernel

11/08/2022
by Uladzislau Yorsh, et al.

In pursuit of faster computation, Efficient Transformers demonstrate an impressive variety of approaches: models attaining sub-quadratic attention complexity can exploit a notion of sparsity or a low-rank approximation of the inputs to reduce the number of attended keys; other ways to reduce complexity include locality-sensitive hashing, key pooling, additional memory to store information in a compacted form, or hybridization with other architectures such as CNNs. Kernelized approaches, often grounded in a strong mathematical foundation, allow the attention to be approximated with linear complexity while retaining high accuracy. In the present paper, we therefore aim to expand the idea of trainable kernel methods to approximate the self-attention mechanism of the Transformer architecture.
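To make the kernelized idea concrete, the sketch below shows the general recipe such methods follow: replace the softmax kernel with an inner product of feature maps produced by a small trainable feedforward network, then use the associativity of matrix products so the cost grows linearly in sequence length. This is a minimal illustration under assumptions of our own (the module name, hidden size, and ReLU feature map are hypothetical choices), not the authors' exact construction.

    # A minimal sketch of linear attention with a trainable feedforward
    # feature map phi. Illustrative only; the architecture of phi is an
    # assumption, not the paper's parameterization.
    import torch
    import torch.nn as nn

    class FeedforwardKernelAttention(nn.Module):
        def __init__(self, dim, hidden_dim=64):
            super().__init__()
            # Trainable feature map phi: its non-negative outputs play
            # the role of kernel features replacing the softmax kernel.
            self.phi = nn.Sequential(
                nn.Linear(dim, hidden_dim),
                nn.ReLU(),  # non-negative features keep the normalizer positive
            )

        def forward(self, q, k, v):
            # q, k, v: (batch, seq_len, dim)
            q_feat = self.phi(q)  # (B, N, H)
            k_feat = self.phi(k)  # (B, N, H)
            # Associativity: computing phi(K)^T V first costs O(N*H*dim),
            # avoiding the O(N^2) pairwise scores of softmax attention.
            kv = torch.einsum('bnh,bnd->bhd', k_feat, v)
            # Normalizer: phi(Q) times the sum of key features.
            z = 1.0 / (torch.einsum('bnh,bh->bn', q_feat, k_feat.sum(dim=1)) + 1e-6)
            return torch.einsum('bnh,bhd,bn->bnd', q_feat, kv, z)

    # Usage: output shape matches the input, (2, 128, 32).
    attn = FeedforwardKernelAttention(dim=32)
    x = torch.randn(2, 128, 32)
    out = attn(x, x, x)

Making phi trainable, rather than a fixed random-feature map, is what distinguishes this family of methods: the kernel itself is learned jointly with the rest of the model.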


Related research

Beyond Nyströmformer – Approximation of self-attention by Spectral Shifting (03/09/2021)
Transformer is a powerful tool for many natural language tasks which is ...

On The Computational Complexity of Self-Attention (09/11/2022)
Transformer architectures have led to remarkable progress in many state-...

Combiner: Full Attention Transformer with Sparse Computation Cost (07/12/2021)
Transformers provide a class of expressive architectures that are extrem...

Linear Complexity Randomized Self-attention Mechanism (04/10/2022)
Recently, random feature attentions (RFAs) are proposed to approximate t...

FLuRKA: Fast fused Low-Rank Kernel Attention (06/27/2023)
Many efficient approximate self-attention techniques have become prevale...

Sparse Attention Acceleration with Synergistic In-Memory Pruning and On-Chip Recomputation (09/01/2022)
As its core computation, a self-attention mechanism gauges pairwise corr...

Primal-Attention: Self-attention through Asymmetric Kernel SVD in Primal Representation (05/31/2023)
Recently, a new line of works has emerged to understand and improve self...
