Linformer: Self-Attention with Linear Complexity

06/08/2020
by Sinong Wang, et al.

Large transformer models have shown extraordinary success in achieving state-of-the-art results in many natural language processing applications. However, training and deploying these models can be prohibitively costly for long sequences, as the standard self-attention mechanism of the Transformer uses O(n^2) time and space with respect to sequence length. In this paper, we demonstrate that the self-attention mechanism can be approximated by a low-rank matrix. We further exploit this finding to propose a new self-attention mechanism, which reduces the overall self-attention complexity from O(n^2) to O(n) in both time and space. The resulting linear transformer, the Linformer, performs on par with standard Transformer models, while being much more memory- and time-efficient.
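The core idea translates directly into code. Below is a minimal single-head NumPy sketch of the projected attention the abstract describes: keys and values of a length-n sequence are compressed by learned projections E and F down to k rows, so the attention score matrix is (n, k) rather than (n, n). The variable names, shapes, and random initialization here are illustrative assumptions for exposition, not the authors' released implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def linformer_attention(Q, K, V, E, F):
    """Single-head Linformer-style attention (sketch).

    Q, K, V: (n, d) query/key/value matrices for a length-n sequence.
    E, F:    (k, n) learned projections that compress the n keys and
             values to k << n rows, making time and memory linear in n
             for a fixed k.
    """
    d = Q.shape[-1]
    K_proj = E @ K                        # (k, d): compressed keys
    V_proj = F @ V                        # (k, d): compressed values
    scores = Q @ K_proj.T / np.sqrt(d)    # (n, k) instead of (n, n)
    return softmax(scores, axis=-1) @ V_proj  # (n, d)

# Illustrative sizes: n = 1024 tokens, head dim d = 64, projected length k = 128.
rng = np.random.default_rng(0)
n, d, k = 1024, 64, 128
Q, K, V = (rng.standard_normal((n, d)) for _ in range(3))
E, F = (rng.standard_normal((k, n)) / np.sqrt(n) for _ in range(2))
out = linformer_attention(Q, K, V, E, F)
print(out.shape)  # (1024, 64)
```

Note how the quadratic cost disappears: the only matrices whose size depends on n are (n, d) and (n, k), so doubling the sequence length doubles, rather than quadruples, the work.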


Related research

- Attention Is Not All You Need Anymore (08/15/2023): In recent years, the popular Transformer architecture has achieved great...
- Is Attention All What You Need? – An Empirical Investigation on Convolution-Based Active Memory and Self-Attention (12/27/2019): The key to a Transformer model is the self-attention mechanism, which al...
- On The Computational Complexity of Self-Attention (09/11/2022): Transformer architectures have led to remarkable progress in many state-...
- A Tensorized Transformer for Language Modeling (06/24/2019): Latest development of neural models has connected the encoder and decode...
- Adaptive Multi-Resolution Attention with Linear Complexity (08/10/2021): Transformers have improved the state-of-the-art across numerous tasks in...
- Sequential Recommendation with Relation-Aware Kernelized Self-Attention (11/15/2019): Recent studies identified that sequential Recommendation is improved by ...
- Temporal Attention for Language Models (02/04/2022): Pretrained language models based on the transformer architecture have sh...
