NLP-IIS@UT at SemEval-2021 Task 4: Machine Reading Comprehension using the Long Document Transformer

by   Hossein Basafa, et al.

This paper presents a technical report of our submission to Task 4 of SemEval-2021: Reading Comprehension of Abstract Meaning. In this task, the goal is to predict the correct answer to a question given a context. The contexts are typically very long and require a large receptive field from the model, so common contextualized language models such as BERT lose representation quality and performance due to their limited input-token capacity. To tackle this problem, we used the Longformer model to process long sequences more effectively. Furthermore, we adopted the method proposed in the Longformer benchmark on the WikiHop dataset, which improved the accuracy on our task data from 23.01 to 70.30.
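The Longformer's key idea, which lets it handle contexts far longer than BERT's input limit, is a sparse attention pattern that combines a local sliding window with a handful of "global" tokens (such as the question tokens) that attend everywhere. A minimal sketch of that attention mask, with illustrative parameters rather than the paper's actual configuration:

```python
# Sketch of a Longformer-style sparse attention mask. Each token attends to a
# local sliding window; designated "global" positions (e.g. question tokens)
# attend to, and are attended by, every position. Window size and global
# positions below are illustrative, not the submission's settings.

def longformer_attention_mask(seq_len, window, global_positions):
    """Return a seq_len x seq_len boolean mask; True = attention allowed."""
    # Local sliding window: token i sees tokens within `window` positions.
    mask = [[abs(i - j) <= window for j in range(seq_len)]
            for i in range(seq_len)]
    # Global tokens get full attention in both directions.
    for g in global_positions:
        for j in range(seq_len):
            mask[g][j] = True   # the global token attends everywhere
            mask[j][g] = True   # every token attends to the global token
    return mask

# Tiny example: 8 tokens, window of 1, token 0 marked global.
mask = longformer_attention_mask(seq_len=8, window=1, global_positions=[0])
```

Because the window is fixed, the cost of the local pattern grows linearly with sequence length instead of quadratically, which is what makes 4,096-token inputs practical.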

Multi Document Reading Comprehension

Reading Comprehension (RC) is a task of answering a question from a give...

ZJUKLAB at SemEval-2021 Task 4: Negative Augmentation with Language Model for Reading Comprehension of Abstract Meaning

This paper presents our systems for the three Subtasks of SemEval Task4:...

VAULT: VAriable Unified Long Text Representation for Machine Reading Comprehension

Existing models on Machine Reading Comprehension (MRC) require complex m...

Exploring Probabilistic Soft Logic as a framework for integrating top-down and bottom-up processing of language in a task context

This technical report describes a new prototype architecture designed to...

ReCAM@IITK at SemEval-2021 Task 4: BERT and ALBERT based Ensemble for Abstract Word Prediction

This paper describes our system for Task 4 of SemEval-2021: Reading Comp...

Recurrent Chunking Mechanisms for Long-Text Machine Reading Comprehension

In this paper, we study machine reading comprehension (MRC) on long text...

Automated Scoring for Reading Comprehension via In-context BERT Tuning

Automated scoring of open-ended student responses has the potential to s...