Do Transformer Modifications Transfer Across Implementations and Applications?

02/23/2021
by Sharan Narang, et al.

The research community has proposed copious modifications to the Transformer architecture since it was introduced over three years ago, relatively few of which have seen widespread adoption. In this paper, we comprehensively evaluate many of these modifications in a shared experimental setting that covers most of the common uses of the Transformer in natural language processing. Surprisingly, we find that most modifications do not meaningfully improve performance. Furthermore, most of the Transformer variants we found beneficial were either developed in the same codebase that we used or are relatively minor changes. We conjecture that performance improvements may strongly depend on implementation details and correspondingly make some recommendations for improving the generality of experimental results.
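One family of "relatively minor changes" the paper reports as beneficial is the GLU-style feed-forward variant, which replaces the standard ReLU feed-forward layer with a gated unit. A minimal NumPy sketch of the two blocks for comparison (the function names, weight shapes, and tanh-approximate GELU here are illustrative assumptions, not code from the paper):

```python
import numpy as np

def relu_ffn(x, W1, W2):
    # Standard Transformer feed-forward block: ReLU(x W1) W2
    return np.maximum(x @ W1, 0.0) @ W2

def geglu_ffn(x, W, V, W2):
    # GEGLU variant: (GELU(x W) * (x V)) W2 -- the ReLU layer is
    # replaced by a gated unit with an extra projection V.
    def gelu(z):
        # tanh approximation of GELU (an assumption for this sketch)
        return 0.5 * z * (1.0 + np.tanh(np.sqrt(2.0 / np.pi) * (z + 0.044715 * z**3)))
    return (gelu(x @ W) * (x @ V)) @ W2

# Illustrative dimensions: batch of 4 tokens, d_model=8, d_ff=32
rng = np.random.default_rng(0)
d_model, d_ff = 8, 32
x = rng.standard_normal((4, d_model))
W1 = rng.standard_normal((d_model, d_ff))
W2 = rng.standard_normal((d_ff, d_model))
V = rng.standard_normal((d_model, d_ff))

# Both blocks map (4, d_model) -> (4, d_model)
print(relu_ffn(x, W1, W2).shape, geglu_ffn(x, W1, V, W2).shape)
```

Note that the gated variant adds parameters (the extra projection `V`), so controlled comparisons of this kind typically shrink `d_ff` to hold parameter count fixed.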

