Big Bird: Transformers for Longer Sequences
Transformer-based models, such as BERT, have been among the most successful deep learning models for NLP. Unfortunately, one of their core limitations is the quadratic dependency (mainly in terms of memory) on the sequence length due to their full attention mechanism. To remedy this, we propose BigBird, a sparse attention mechanism that reduces this quadratic dependency to linear. We show that BigBird is a universal approximator of sequence functions and is Turing complete, thereby preserving these properties of the quadratic, full-attention model. Along the way, our theoretical analysis reveals some of the benefits of having O(1) global tokens (such as CLS) that attend to the entire sequence as part of the sparse attention mechanism. The proposed sparse attention can handle sequences up to 8x longer than was previously possible using similar hardware. As a consequence of the capability to handle longer context, BigBird drastically improves performance on various NLP tasks such as question answering and summarization. We also propose novel applications to genomics data.
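To make the attention pattern described above concrete, here is a minimal sketch (not the authors' block-sparse implementation) of a BigBird-style sparse attention mask that combines the three components the abstract alludes to: a sliding window, a handful of global tokens such as CLS, and a few random connections per query. Function names, parameter values, and the NumPy-based attention routine are illustrative assumptions.

```python
import numpy as np

def bigbird_style_mask(seq_len, window=3, n_global=1, n_random=2, seed=0):
    """Boolean (seq_len, seq_len) mask; True means query i may attend to key j."""
    rng = np.random.default_rng(seed)
    mask = np.zeros((seq_len, seq_len), dtype=bool)

    # 1) Sliding-window attention: each token attends to its local neighborhood.
    for i in range(seq_len):
        lo, hi = max(0, i - window), min(seq_len, i + window + 1)
        mask[i, lo:hi] = True

    # 2) O(1) global tokens (e.g. CLS): attend to everything, attended to by everything.
    mask[:n_global, :] = True
    mask[:, :n_global] = True

    # 3) A few random connections per query token.
    for i in range(seq_len):
        mask[i, rng.choice(seq_len, size=n_random, replace=False)] = True
    return mask

def sparse_attention(Q, K, V, mask):
    """Masked scaled dot-product attention; disallowed pairs get -inf before softmax."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)
    scores = np.where(mask, scores, -np.inf)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V

if __name__ == "__main__":
    n, d = 16, 8
    rng = np.random.default_rng(1)
    Q, K, V = (rng.standard_normal((n, d)) for _ in range(3))
    out = sparse_attention(Q, K, V, bigbird_style_mask(n))
    print(out.shape)  # (16, 8); each query uses only O(window + globals + randoms) keys
```

Note that this sketch still materializes the full n-by-n score matrix and only zeroes out the disallowed pairs, so it illustrates the connectivity pattern rather than the linear memory cost; the actual linear scaling comes from computing attention block-sparsely so that masked entries are never formed.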