ReadTwice: Reading Very Large Documents with Memories

05/10/2021
by Yury Zemlyanskiy et al.

Knowledge-intensive tasks such as question answering often require assimilating information from different sections of large inputs such as books or article collections. We propose ReadTwice, a simple and effective technique that combines several strengths of prior approaches to model long-range dependencies with Transformers. The main idea is to read text in small segments, in parallel, summarizing each segment into a memory table to be used in a second read of the text. We show that the method outperforms models of comparable size on several question answering (QA) datasets and sets a new state of the art on the challenging NarrativeQA task, with questions about entire books. Source code and pre-trained checkpoints for ReadTwice can be found at https://goo.gle/research-readtwice.
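
To make the two-pass scheme concrete, here is a minimal PyTorch sketch, not the released implementation: it assumes each segment summary is a mean-pooled hidden state (a stand-in for the paper's memory construction) and injects the memory table through a single cross-attention step. The class and variable names (`TwoPassReader`, `memory_attn`, and so on) are hypothetical.

```python
import torch
import torch.nn as nn

class TwoPassReader(nn.Module):
    """First read: encode segments independently and summarize each into a
    memory table. Second read: re-encode segments while letting every token
    attend to the memories of all segments."""

    def __init__(self, vocab_size=30522, d_model=256, n_heads=4):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        enc1 = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.first_read = nn.TransformerEncoder(enc1, num_layers=2)
        # Cross-attention: token states query the shared memory table.
        self.memory_attn = nn.MultiheadAttention(d_model, n_heads,
                                                 batch_first=True)
        enc2 = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.second_read = nn.TransformerEncoder(enc2, num_layers=2)

    def forward(self, segments):
        # segments: (num_segments, seg_len) token ids; segments act as the
        # batch dimension, so the first read runs over them in parallel.
        h = self.first_read(self.embed(segments))             # (S, L, D)
        memories = h.mean(dim=1)                              # (S, D) summaries
        # Every segment sees the same table of all segment memories.
        table = memories.unsqueeze(0).expand(h.size(0), -1, -1)  # (S, S, D)
        long_range, _ = self.memory_attn(h, table, table)     # (S, L, D)
        return self.second_read(h + long_range)

reader = TwoPassReader()
book = torch.randint(0, 30522, (8, 128))  # 8 segments of 128 tokens each
states = reader(book)                     # (8, 128, 256) contextual states
```

The sketch keeps the structural point of the abstract: the first pass is embarrassingly parallel across segments, and the second pass gains document-wide context through the compact memory table rather than full attention over the entire input.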

Related research

11/10/2021 · Pre-trained Transformer-Based Approach for Arabic Question Answering: A Comparative Study
Question answering (QA) is one of the most challenging yet widely investi...

01/14/2021 · TSQA: Tabular Scenario Based Question Answering
Scenario-based question answering (SQA) has attracted an increasing rese...

12/04/2019 · An Exploration of Data Augmentation and Sampling Techniques for Domain-Agnostic Question Answering
To produce a domain-agnostic question answering model for the Machine Re...

05/12/2023 · A Memory Model for Question Answering from Streaming Data Supported by Rehearsal and Anticipation of Coreference Information
Existing question answering methods often assume that the input content ...

10/11/2022 · Capturing Global Structural Information in Long Document Question Answering with Compressive Graph Selector Network
Long document question answering is a challenging task due to its demand...

06/09/2016 · Key-Value Memory Networks for Directly Reading Documents
Directly reading documents and being able to answer questions from them ...

12/15/2022 · Saved You A Click: Automatically Answering Clickbait Titles
Often clickbait articles have a title that is phrased as a question or v...
