
Why Do Neural Language Models Still Need Commonsense Knowledge to Handle Semantic Variations in Question Answering?

09/01/2022
by Sunjae Kwon, et al.

Many contextualized word representations are now learned by intricate neural network models, such as masked neural language models (MNLMs), which consist of huge neural network structures and are trained to restore masked text. Such representations achieve superhuman performance on some reading comprehension (RC) tasks, which require extracting the correct answer from a context given a question. However, identifying exactly what knowledge an MNLM has learned is difficult because its numerous parameters are intermingled. This paper provides new insights into, and empirical analyses of, the commonsense knowledge contained in pretrained MNLMs. First, we use a diagnostic test that evaluates whether commonsense knowledge is properly learned by MNLMs. We observe that a large proportion of commonsense knowledge is not appropriately captured and that MNLMs often fail to understand the semantic meaning of relations. In addition, we find that MNLM-based RC models remain vulnerable to semantic variations that require commonsense knowledge. Finally, we identify the fundamental reason why some knowledge is not learned, and we suggest that utilizing an external commonsense knowledge repository can be an effective remedy. In controlled experiments, we demonstrate that the limitations of MNLM-based RC models can be overcome by enriching the text with the required knowledge from an external commonsense knowledge repository.
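To make the two ideas in the abstract concrete, here is a minimal Python sketch using the Hugging Face transformers library: a cloze-style probe of a masked language model, and text enrichment before extractive QA. The model choice (bert-base-uncased and the pipeline's default QA model), the probe sentence, and the enrichment example are illustrative assumptions, not the paper's actual diagnostic set or RC benchmark.

```python
# A minimal sketch of (1) probing an MNLM for commonsense knowledge and
# (2) enriching RC input with an external fact. Illustrative only; the
# paper's exact models, probes, and datasets may differ.
from transformers import pipeline

# 1) Cloze-style diagnostic: render a commonsense triple such as
#    (knife, UsedFor, cutting) as a masked sentence and check whether
#    the MNLM ranks the correct filler highly.
fill_mask = pipeline("fill-mask", model="bert-base-uncased")
for pred in fill_mask("A knife is used for [MASK].", top_k=5):
    print(f"{pred['token_str']:>12}  {pred['score']:.3f}")

# 2) Knowledge enrichment: prepend the fact that the RC model would
#    otherwise have to supply from its parameters, then ask the
#    question over the enriched context.
qa = pipeline("question-answering")  # default extractive QA model
context = "Tom handed Mary the knife so she could prepare the bread."
fact = "A knife is used for cutting."
print(qa(question="What is the knife used for?", context=fact + " " + context))
```

If the probe's top predictions miss the correct filler, that commonsense triple is plausibly not encoded in the model's parameters, which is exactly the case where prepending the fact, as in step 2, can help the RC model.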



Related research

11/08/2019

Why Do Masked Neural Language Models Still Need Common Sense Knowledge?

Currently, contextualized word representations are learned by intricate ...
11/27/2019

Evaluating Commonsense in Pre-trained Language Models

Contextualized representations trained over large raw text data have giv...
09/05/2018

Improving Question Answering by Commonsense-Based Pre-Training

Although neural network approaches achieve remarkable success on a varie...
01/04/2021

Benchmarking Knowledge-Enhanced Commonsense Question Answering via Knowledge-to-Text Transformation

A fundamental ability of humans is to utilize commonsense knowledge in l...
05/21/2018

Knowledgeable Reader: Enhancing Cloze-Style Reading Comprehension with External Commonsense Knowledge

We introduce a neural reading comprehension model that integrates extern...
02/01/2021

Commonsense Knowledge Mining from Term Definitions

Commonsense knowledge has proven to be beneficial to a variety of applic...
07/04/2021

Coarse-to-Careful: Seeking Semantic-related Knowledge for Open-domain Commonsense Question Answering

It is prevalent to utilize external knowledge to help machine answer que...