Improving Low-resource Reading Comprehension via Cross-lingual Transposition Rethinking

07/11/2021
by Gaochen Wu, et al.

Extractive Reading Comprehension (ERC) has made tremendous advances, enabled by the availability of large-scale, high-quality ERC training data. Despite such rapid progress and widespread application, ERC datasets in languages other than high-resource ones such as English remain scarce. To address this issue, we propose a Cross-Lingual Transposition ReThinking (XLTT) model that leverages existing high-quality extractive reading comprehension datasets in a multilingual setting. Specifically, we present multilingual adaptive attention (MAA), which combines intra-attention and inter-attention to learn more generalizable semantic and lexical knowledge from each pair of language families. Furthermore, to make full use of existing datasets, we adopt a new training framework that weights each existing dataset by its task-level similarity to the target dataset. Experimental results show that our XLTT model surpasses six baselines on two multilingual ERC benchmarks and is especially effective for low-resource languages, with average improvements of 3.9 F1 and 4.1 EM.
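The abstract describes two components: an attention module that mixes intra-attention and inter-attention across a language pair, and a training scheme that weights source datasets by their task-level similarity to the target. The sketch below illustrates one plausible reading of both ideas in PyTorch; the names (MultilingualAdaptiveAttention, task_similarity_weights), the sigmoid gating mechanism, and the cosine-similarity proxy are assumptions for illustration, not the paper's actual formulation.

```python
# Hypothetical sketch of the ideas in the abstract; the XLTT paper's
# exact formulation may differ.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultilingualAdaptiveAttention(nn.Module):
    """Combine intra-attention (within one language's sequence) with
    inter-attention (over a paired language's sequence) via a learned gate."""
    def __init__(self, dim: int, num_heads: int = 8):
        super().__init__()
        self.intra = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.inter = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.gate = nn.Linear(2 * dim, dim)

    def forward(self, x: torch.Tensor, paired: torch.Tensor) -> torch.Tensor:
        # Intra-attention: queries, keys, and values all come from x.
        intra_out, _ = self.intra(x, x, x)
        # Inter-attention: x attends over the paired language's states.
        inter_out, _ = self.inter(x, paired, paired)
        # Adaptive gate mixes the two views per position.
        g = torch.sigmoid(self.gate(torch.cat([intra_out, inter_out], dim=-1)))
        return g * intra_out + (1 - g) * inter_out

def task_similarity_weights(source_feats: torch.Tensor,
                            target_feat: torch.Tensor) -> torch.Tensor:
    """Task-level similarity between each source dataset and the target,
    approximated here by cosine similarity of mean dataset embeddings,
    normalized into a sampling/weighting distribution."""
    sims = F.cosine_similarity(source_feats, target_feat.unsqueeze(0), dim=-1)
    return F.softmax(sims, dim=0)

# Toy usage with random tensors standing in for encoder states.
maa = MultilingualAdaptiveAttention(dim=64)
x = torch.randn(2, 10, 64)       # batch of sequences in the target language
paired = torch.randn(2, 12, 64)  # paired sequences in another language
fused = maa(x, paired)           # shape: (2, 10, 64)
weights = task_similarity_weights(torch.randn(5, 64), torch.randn(64))
```

A per-position sigmoid gate is one common way to blend two attention views; the paper's adaptive combination and its task-similarity measure may be defined differently.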

