Global memory transformer for processing long documents

12/03/2022
by Arij Al Adel, et al.

Transformer variants dominate the state of the art in natural language processing tasks such as translation, reading comprehension, and summarization. This paper continues our previous work on augmenting model inputs with general memory slots and studies the effect of adding these slots. We consider two main tasks: 1) a pretraining task using masked language modeling and 2) a fine-tuning task using HotpotQA. The study aims to verify the ability of the proposed model to process input chunks as if they were a single chunk, compared with the base model. As a baseline we used the T5 transformer. We studied the role of the memory slots appended to each input chunk and evaluated the model's performance without a selector. We found that adding memory to input chunks helped the proposed model outperform the baseline on the masked language modeling task under specific training settings. An ablation study shows that compressed input chunks can still be used, at the cost of some degradation in performance.
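The abstract does not spell out the mechanism, but the core idea it describes (learnable global memory slots prepended to each input chunk, encoded jointly with the chunk's tokens, and then shared across chunks) can be sketched in PyTorch. This is an illustrative sketch, not the paper's code: the class name MemoryAugmentedChunkEncoder, the mean-pooling of memory states across chunks, and all hyperparameters are assumptions for the example.

```python
import torch
import torch.nn as nn

class MemoryAugmentedChunkEncoder(nn.Module):
    """Hypothetical sketch of memory-slot augmentation for chunked inputs.

    Learnable global memory slots are prepended to each chunk so that
    tokens and memory attend to each other inside the encoder; memory
    states are then pooled across chunks to form a document-level state.
    The exact exchange mechanism in the paper may differ.
    """

    def __init__(self, vocab_size=32128, d_model=512, n_heads=8,
                 n_layers=2, n_memory_slots=16):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        # Learnable global memory slots, shared across all chunks (assumed).
        self.memory = nn.Parameter(torch.randn(n_memory_slots, d_model) * 0.02)
        layer = nn.TransformerEncoderLayer(d_model, n_heads,
                                           dim_feedforward=4 * d_model,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.n_mem = n_memory_slots

    def forward(self, chunk_ids):
        # chunk_ids: (n_chunks, chunk_len) token ids from one long document.
        n_chunks, _ = chunk_ids.shape
        tokens = self.embed(chunk_ids)                        # (C, L, D)
        mem = self.memory.unsqueeze(0).expand(n_chunks, -1, -1)
        x = torch.cat([mem, tokens], dim=1)                   # memory slots first
        h = self.encoder(x)                                   # joint self-attention
        mem_out, tok_out = h[:, :self.n_mem], h[:, self.n_mem:]
        # Pool memory over chunks so every chunk contributes to a global state.
        global_mem = mem_out.mean(dim=0, keepdim=True)        # (1, M, D)
        return tok_out, global_mem

# Usage: encode a long document split into 4 chunks of 128 tokens each.
enc = MemoryAugmentedChunkEncoder()
doc_chunks = torch.randint(0, 32128, (4, 128))
token_states, global_memory = enc(doc_chunks)
```

In the paper's setting the memory exchange presumably happens inside a T5 encoder, possibly gated by the selector that the ablation removes; the mean-pooling step above is only a stand-in for whichever cross-chunk mechanism the full model uses.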


Related research

06/20/2020 | Memory Transformer
Transformer-based models have achieved state-of-the-art results in many ...

06/05/2020 | GMAT: Global Memory Augmentation for Transformers
Transformer-based models have become ubiquitous in natural language proc...

08/08/2022 | Investigating Efficiently Extending Transformers for Long Input Summarization
While large pretrained Transformer models have proven highly capable at ...

12/15/2022 | Efficient Long Sequence Modeling via State Space Augmented Transformer
Transformer models have achieved superior performance in various natural...

09/16/2021 | Regularized Training of Nearest Neighbor Language Models
Including memory banks in a natural language processing architecture inc...

06/15/2023 | Recurrent Memory Decision Transformer
Transformative models, originally developed for natural language problem...

09/09/2019 | Picture What you Read
Visualization refers to our ability to create an image in our head based...
