Hierarchical Transformers Are More Efficient Language Models

10/26/2021
by Piotr Nawrot, et al.

Transformer models yield impressive results on many NLP and sequence modeling tasks. Remarkably, Transformers can handle long sequences, which allows them to produce long coherent outputs: full paragraphs produced by GPT-3 or well-structured images produced by DALL-E. These large language models are impressive but also very inefficient and costly, which limits their applications and accessibility. We postulate that having an explicit hierarchical architecture is the key to Transformers that efficiently handle long sequences. To verify this claim, we first study different ways to downsample and upsample activations in Transformers so as to make them hierarchical. We use the best-performing upsampling and downsampling layers to create Hourglass, a hierarchical Transformer language model. Hourglass improves upon the Transformer baseline given the same amount of computation and can yield the same results as Transformers more efficiently. In particular, Hourglass sets a new state of the art for Transformer models on the ImageNet32 generation task and improves language modeling efficiency on the widely studied enwik8 benchmark.
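
To make the hourglass shape concrete, here is a minimal PyTorch sketch of such a model: a few layers at full resolution, mean pooling to downsample by a factor k, the bulk of the layers on the shortened sequence, then naive repetition plus a residual connection to upsample. The paper compares several downsampling and upsampling layers, and the simple choices below are illustrative rather than its best-performing ones; the class name and all hyperparameters are assumptions for this sketch.

```python
import torch
import torch.nn as nn

class HourglassLM(nn.Module):
    """Illustrative hourglass-shaped Transformer language model.

    Most layers run on a k-times-shorter sequence, which is where
    the compute savings come from.
    """

    def __init__(self, vocab_size, d_model=256, n_head=4, k=3,
                 pre_layers=2, mid_layers=8, post_layers=2):
        super().__init__()
        self.k = k
        self.embed = nn.Embedding(vocab_size, d_model)

        def stack(n_layers):
            layer = nn.TransformerEncoderLayer(
                d_model, n_head, dim_feedforward=4 * d_model,
                batch_first=True)
            return nn.TransformerEncoder(layer, n_layers)

        self.pre = stack(pre_layers)    # full-resolution "vanilla" layers
        self.mid = stack(mid_layers)    # shortened: bulk of the compute
        self.post = stack(post_layers)  # full resolution again
        self.lm_head = nn.Linear(d_model, vocab_size)

    def forward(self, tokens):
        # tokens: (batch, seq_len); seq_len assumed divisible by k here.
        b, n = tokens.shape
        causal = nn.Transformer.generate_square_subsequent_mask(n).to(tokens.device)
        x = self.pre(self.embed(tokens), mask=causal)  # (b, n, d)

        # Downsample: mean-pool each group of k consecutive activations.
        # (A faithful autoregressive model must also shift the sequence so
        # no position sees pooled information from its own future; that
        # detail is omitted in this sketch.)
        short = x.reshape(b, n // self.k, self.k, -1).mean(dim=2)
        causal_s = nn.Transformer.generate_square_subsequent_mask(
            n // self.k).to(tokens.device)
        short = self.mid(short, mask=causal_s)  # (b, n // k, d)

        # Upsample: repeat each pooled vector k times and add a residual
        # from the full-resolution stream, then finish at full resolution.
        up = short.repeat_interleave(self.k, dim=1)
        x = self.post(x + up, mask=causal)
        return self.lm_head(x)  # next-token logits: (b, n, vocab_size)

model = HourglassLM(vocab_size=256, k=3)
logits = model(torch.randint(0, 256, (2, 24)))  # -> shape (2, 24, 256)
```

Because self-attention cost grows quadratically with sequence length, running the middle stack on n/k tokens reduces its attention cost by roughly a factor of k squared, which is the source of the efficiency gains the abstract describes.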


Related research

11/24/2021 · Sparse is Enough in Scaling Transformers
12/01/2020 · Modifying Memories in Transformer Models
05/01/2020 · Multi-scale Transformer Language Models
01/13/2020 · Reformer: The Efficient Transformer
04/12/2022 · What do Toothbrushes do in the Kitchen? How Transformers Think our World is Structured
05/03/2022 · Mixed-effects transformers for hierarchical adaptation
04/17/2019 · Dynamic Evaluation of Transformer Language Models
