Machine Reading Comprehension: The Role of Contextualized Language Models and Beyond

05/13/2020
by Zhuosheng Zhang, et al.

Machine reading comprehension (MRC) aims to teach machines to read and comprehend human language, a long-standing goal of natural language processing (NLP). With the rise of deep neural networks and the evolution of contextualized language models (CLMs), MRC research has experienced two significant breakthroughs. As a phenomenon, MRC and CLMs have had a great impact on the NLP community. In this survey, we provide a comprehensive and comparative review of MRC, covering 1) the origin and development of MRC and CLMs, with a particular focus on the role of CLMs; 2) the impact of MRC and CLMs on the NLP community; 3) the definition, datasets, and evaluation of MRC; 4) general MRC architectures and technical methods, viewed as a two-stage encoder-decoder solving architecture inspired by the human cognitive process; 5) previous highlights, emerging topics, and our empirical analysis, with particular attention to what has worked in different periods of MRC research. We propose a full-view categorization and new taxonomies for these topics. The primary conclusions we arrive at are that 1) MRC drives progress from language processing to language understanding; 2) the rapid improvement of MRC systems greatly benefits from the development of CLMs; 3) the theme of MRC is gradually moving from shallow text matching to cognitive reasoning.
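To make the two-stage encoder-decoder view concrete, the sketch below (our illustration, not the authors' code) wires a pretrained CLM encoder to a minimal span-prediction decoder for extractive MRC. The checkpoint name, head design, and example inputs are illustrative assumptions, and the head is untrained, so its predicted span is meaningless until the model is fine-tuned on an MRC dataset.

    # Minimal sketch of the two-stage encoder-decoder MRC setup discussed in the survey:
    # stage 1 encodes question + passage jointly with a contextualized language model (CLM);
    # stage 2 decodes an answer span with a lightweight start/end prediction head.
    # The checkpoint name and head design are illustrative assumptions.
    import torch
    import torch.nn as nn
    from transformers import AutoModel, AutoTokenizer

    class SpanMRC(nn.Module):
        def __init__(self, clm_name: str = "bert-base-uncased"):
            super().__init__()
            self.encoder = AutoModel.from_pretrained(clm_name)               # stage 1: CLM encoder
            self.span_head = nn.Linear(self.encoder.config.hidden_size, 2)   # stage 2: span decoder

        def forward(self, **inputs):
            hidden = self.encoder(**inputs).last_hidden_state                # (batch, seq_len, hidden)
            start_logits, end_logits = self.span_head(hidden).split(1, dim=-1)
            return start_logits.squeeze(-1), end_logits.squeeze(-1)          # (batch, seq_len) each

    tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
    model = SpanMRC()
    enc = tokenizer("What does MRC aim to do?",                                              # question
                    "MRC aims to teach machines to read and comprehend human language.",     # passage
                    return_tensors="pt")
    with torch.no_grad():
        start_logits, end_logits = model(**enc)
    # With an untrained head the argmax span is arbitrary; after fine-tuning, the argmax of
    # start_logits and end_logits marks the answer span within the passage.
    print(start_logits.shape, end_logits.shape)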

Related research

08/20/2020  An Experimental Study of Deep Neural Network Models for Vietnamese Multiple-Choice Reading Comprehension
10/10/2018  New Vistas to study Bhartrhari: Cognitive NLP
03/04/2018  CAESAR: Context Awareness Enabled Summary-Attentive Reader
01/04/2021  Retrieving and Reading: A Comprehensive Survey on Open-domain Question Answering
03/04/2021  Advances in Multi-turn Dialogue Comprehension: A Survey
04/27/2020  Natural language processing for achieving sustainable development: the case of neural labelling to enhance community profiling
07/12/2018  Recurrent Neural Networks in Linguistic Theory: Revisiting Pinker and Prince (1988) and the Past Tense Debate