Co-Attention Hierarchical Network: Generating Coherent Long Distractors for Reading Comprehension

11/20/2019
by Xiaorui Zhou, et al.

In reading comprehension, generating sentence-level distractors is a significant task that requires a deep understanding of the article and question. Traditional entity-centered methods can only generate word-level or phrase-level distractors. Although recently proposed neural methods such as the sequence-to-sequence (Seq2Seq) model show great potential for generating creative text, previous neural approaches to distractor generation ignore two important aspects. First, they do not model the interactions between the article and the question, so the generated distractors tend to be too general or irrelevant to the question context. Second, they do not emphasize the relationship between the distractor and the article, so the generated distractors are not semantically relevant to the article and thus fail to form a set of meaningful options. To address the first problem, we propose a co-attention enhanced hierarchical architecture that better captures the interactions between the article and the question, thus guiding the decoder to generate more coherent distractors. To alleviate the second problem, we add an auxiliary semantic similarity loss that pushes the generated distractors to be more relevant to the article. Experimental results show that our model outperforms several strong baselines on automatic metrics, achieving state-of-the-art performance. A further human evaluation indicates that our generated distractors are more coherent and more educative than those produced by the baselines.
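The abstract names two components: a co-attention mechanism between article and question representations, and a semantic similarity loss between the distractor and the article. Below is a minimal NumPy sketch of both ideas, assuming a simple dot-product affinity matrix and cosine similarity; the function names, shapes, and pooling choices are illustrative assumptions, not the paper's actual architecture.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def co_attention(article, question):
    """Co-attention sketch: an affinity matrix between article and
    question token states, normalized in both directions to yield
    question-aware article states and article-aware question states.
    Shapes: article (n, d), question (m, d) -- hypothetical encoder outputs."""
    affinity = article @ question.T               # (n, m) token-pair affinities
    a2q = softmax(affinity, axis=1) @ question    # (n, d) question-aware article states
    q2a = softmax(affinity, axis=0).T @ article   # (m, d) article-aware question states
    return a2q, q2a

def semantic_similarity_loss(distractor_vec, article_vec):
    """Auxiliary loss pushing the distractor representation toward the
    article representation: 1 - cosine similarity (illustrative form)."""
    cos = distractor_vec @ article_vec / (
        np.linalg.norm(distractor_vec) * np.linalg.norm(article_vec) + 1e-8)
    return 1.0 - cos

# Toy usage with random "hidden states"
rng = np.random.default_rng(0)
art = rng.normal(size=(5, 8))   # 5 article tokens, dim 8
que = rng.normal(size=(3, 8))   # 3 question tokens, dim 8
a2q, q2a = co_attention(art, que)
print(a2q.shape, q2a.shape)     # (5, 8) (3, 8)
print(semantic_similarity_loss(art.mean(0), art.mean(0)) < 1e-6)  # identical vectors -> ~0 loss
```

In the paper's hierarchical setting, such attended states would feed a decoder, and the similarity loss would be added to the generation objective; here both pieces are shown in isolation only to make the two ideas concrete.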

