Subword-augmented Embedding for Cloze Reading Comprehension

06/24/2018
by   Zhuosheng Zhang, et al.

Representation learning is the foundation of machine reading comprehension. State-of-the-art models broadly use word-level and character-level representations. However, the character is not necessarily the minimal meaningful linguistic unit, and the simple concatenation of character and word embeddings used in previous models yields a suboptimal solution. In this paper, we propose to use subwords rather than characters for word embedding enhancement. We also empirically explore different augmentation strategies for subword-augmented embedding to improve a cloze-style reading comprehension model. Specifically, we present a reader that augments word embeddings with subword-level representations and uses a short list of frequent words to handle rare words effectively. A thorough examination is conducted to evaluate the overall performance and generalization ability of the proposed reader. Experimental results show that the proposed approach helps the reader significantly outperform state-of-the-art baselines on various public datasets.
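The core idea above can be illustrated with a minimal sketch. The segmenter, vocabularies, and vector sizes below are toy assumptions (the paper derives subwords from a learned segmentation such as BPE, and explores several augmentation strategies beyond the simple concatenation shown here): frequent words on the short list keep their own word vector, while rare words fall back to a shared out-of-vocabulary vector, and every word additionally receives an averaged subword vector.

```python
import numpy as np

rng = np.random.default_rng(0)
DIM = 4  # toy embedding size for illustration

# Hypothetical toy vocabularies; real subwords come from a learned segmentation.
word_vocab = {"the": 0, "cat": 1}              # frequent words on the "short list"
subword_vocab = {"un": 0, "seen": 1, "cat": 2, "the": 3}

word_emb = rng.normal(size=(len(word_vocab), DIM))
subword_emb = rng.normal(size=(len(subword_vocab), DIM))
OOV = np.zeros(DIM)  # shared vector for rare words outside the short list

def segment(word):
    """Toy greedy longest-match segmentation into known subwords."""
    pieces, i = [], 0
    while i < len(word):
        for j in range(len(word), i, -1):
            if word[i:j] in subword_vocab:
                pieces.append(word[i:j])
                i = j
                break
        else:
            i += 1  # skip characters covered by no subword
    return pieces

def embed(word):
    """Augment the word-level vector with an averaged subword-level vector."""
    w = word_emb[word_vocab[word]] if word in word_vocab else OOV
    pieces = segment(word)
    if pieces:
        s = subword_emb[[subword_vocab[p] for p in pieces]].mean(axis=0)
    else:
        s = np.zeros(DIM)
    return np.concatenate([w, s])  # one simple augmentation strategy

print(embed("cat").shape)     # frequent word: own word vector + subword vector
print(embed("unseen").shape)  # rare word: OOV vector + averaged subwords
```

Note how the rare word "unseen" still receives an informative representation through its subwords "un" and "seen", which is the motivation for subword augmentation over a pure word-level lookup.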

