Beyond English-only Reading Comprehension: Experiments in Zero-Shot Multilingual Transfer for Bulgarian

08/05/2019
by   Momchil Hardalov, et al.

Recently, reading comprehension models have achieved near-human performance on large-scale datasets such as SQuAD, CoQA, MS MARCO, and RACE. This is largely due to the release of pre-trained contextualized representations such as BERT and ELMo, which can be fine-tuned for the target task. Despite these advances and the creation of more challenging datasets, most of the work is still done for English. Here, we study the effectiveness of multilingual BERT fine-tuned on large-scale English reading comprehension datasets (e.g., RACE), which we apply to Bulgarian multiple-choice reading comprehension. We propose a new dataset containing 2,221 questions from twelfth-grade matriculation exams in various subjects (history, biology, geography, and philosophy), plus 412 additional questions from online history quizzes. As the quiz authors provided no relevant context, we incorporate knowledge from Wikipedia, retrieving documents that match the combination of the question and each answer option. Moreover, we experiment with different indexing and pre-training strategies. The evaluation results show accuracy of 42.23%, well above the baseline of 24.89%.
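The retrieval step described in the abstract (one query per answer option, formed from the question plus that option) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the class and function names (`Question`, `build_queries`) are assumptions, and the actual system indexes Bulgarian Wikipedia with an IR engine, which is omitted here.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Question:
    text: str          # the question stem
    options: List[str] # the multiple-choice answer options

def build_queries(q: Question) -> List[str]:
    # One retrieval query per option: the question text
    # concatenated with that candidate answer.
    return [f"{q.text} {opt}" for opt in q.options]

if __name__ == "__main__":
    q = Question(
        text="Which ruler adopted Christianity as the state religion of Bulgaria?",
        options=["Boris I", "Simeon I", "Krum", "Asparuh"],
    )
    for query in build_queries(q):
        print(query)
```

Each query would then be sent to the document index, and the top-ranked passages serve as the context that the quiz questions otherwise lack.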

