
Transformer with Memory Replay

by   Rui Liu, et al.
University of Michigan

Transformers achieve state-of-the-art performance on natural language processing tasks by pre-training on large-scale text corpora, but they are extremely compute-intensive and have very high sample complexity. Memory replay is a mechanism that remembers and reuses past examples by saving them to, and replaying them from, a memory buffer. It has been used successfully in reinforcement learning and GANs, where it improves sample efficiency. In this paper, we propose Transformer with Memory Replay (TMR), which integrates memory replay with transformers to make them more sample-efficient. Experiments on the GLUE and SQuAD benchmark datasets show that Transformer with Memory Replay achieves an increase of at least one percentage point over the baseline transformer model when pre-trained on the same number of examples. Further, by adopting a careful design that reduces the wall-clock overhead of memory replay, we also empirically achieve better runtime efficiency.
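The abstract describes memory replay as saving past examples to a buffer and replaying them later during training. The paper's own buffer design is not given here; as a rough illustration only, a fixed-capacity ring buffer with uniform random replay might look like the following sketch (the class and method names are hypothetical, not from the paper):

```python
import random


class ReplayBuffer:
    """Fixed-capacity buffer that stores past training examples
    and replays a random batch of them on demand."""

    def __init__(self, capacity, seed=0):
        self.capacity = capacity
        self.buffer = []
        self.position = 0  # next slot to overwrite once full
        self.rng = random.Random(seed)

    def add(self, example):
        # Append until full, then overwrite the oldest entry (ring buffer).
        if len(self.buffer) < self.capacity:
            self.buffer.append(example)
        else:
            self.buffer[self.position] = example
        self.position = (self.position + 1) % self.capacity

    def sample(self, batch_size):
        # Replay a uniformly random batch of stored examples.
        return self.rng.sample(self.buffer, min(batch_size, len(self.buffer)))


# During pre-training, each fresh minibatch would be saved to the buffer,
# and replayed batches periodically mixed into the optimizer updates.
buffer = ReplayBuffer(capacity=4)
for step in range(10):
    buffer.add(f"example-{step}")

replayed = buffer.sample(2)
```

After the loop, the buffer holds only the four most recent examples, and `sample` draws the replayed batch from those without replacement.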
