Commonsense Knowledge Mining from Pretrained Models

09/02/2019
by Joshua Feldman, et al.

Inferring commonsense knowledge is a key challenge in natural language processing, but due to the sparsity of training data, previous work has shown that supervised methods for commonsense knowledge mining underperform when evaluated on novel data. In this work, we develop a method for generating commonsense knowledge using a large, pre-trained bidirectional language model. By transforming relational triples into masked sentences, we can use this model to rank a triple's validity by the estimated pointwise mutual information between the two entities. Since we do not update the weights of the bidirectional model, our approach is not biased by the coverage of any one commonsense knowledge base. Though this method performs worse on a test set than models explicitly trained on a corresponding training set, it outperforms these methods when mining commonsense knowledge from new sources, suggesting that unsupervised techniques may generalize better than current supervised approaches.
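As a rough illustration of the scoring step, the sketch below converts a single ConceptNet-style triple into a templated sentence and estimates the pointwise mutual information between head and tail from a masked language model's token probabilities. The template string, the restriction to single-token entities, and the choice of bert-base-uncased are illustrative assumptions; the paper additionally generates multiple candidate sentences and handles multi-word entities, which are omitted here.

```python
# Minimal sketch of PMI-based triple scoring with a masked LM.
# Assumes single-token head/tail entities and a hand-written template;
# these are simplifying assumptions, not the paper's full method.
import torch
from transformers import BertTokenizer, BertForMaskedLM

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForMaskedLM.from_pretrained("bert-base-uncased")
model.eval()

def masked_log_prob(tokens, target_idx, target_id):
    """Log-probability of target_id at position target_idx, masked out."""
    tokens = list(tokens)
    tokens[target_idx] = tokenizer.mask_token
    ids = tokenizer.convert_tokens_to_ids(tokens)
    input_ids = torch.tensor([tokenizer.build_inputs_with_special_tokens(ids)])
    with torch.no_grad():
        logits = model(input_ids).logits
    # +1 offsets the [CLS] token prepended by build_inputs_with_special_tokens
    log_probs = torch.log_softmax(logits[0, target_idx + 1], dim=-1)
    return log_probs[target_id].item()

def pmi_score(head, template, tail):
    """Approximate PMI(tail; head | relation) under the masked LM:
    log p(tail | head, relation) - log p(tail | relation)."""
    tokens = tokenizer.tokenize(template.format(head=head, tail=tail))
    head_idx = tokens.index(head)
    tail_idx = tokens.index(tail)
    tail_id = tokenizer.convert_tokens_to_ids(tail)

    # Conditional term: only the tail is masked.
    cond = masked_log_prob(tokens, tail_idx, tail_id)

    # Marginal term: mask the head as well, so the tail's probability
    # no longer depends on the head entity.
    tokens_no_head = list(tokens)
    tokens_no_head[head_idx] = tokenizer.mask_token
    marg = masked_log_prob(tokens_no_head, tail_idx, tail_id)

    return cond - marg

# Hypothetical template for a CapableOf-style triple: (bird, CapableOf, fly).
print(pmi_score("bird", "a {head} can {tail}", "fly"))
```

Under this scoring scheme, a valid triple should receive a higher score than a corrupted one (e.g., swapping in an unrelated tail), which is how the ranking evaluation in the abstract can be read.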


Related research

04/24/2018
Commonsense mining as knowledge base completion? A study on the impact of novelty
Commonsense knowledge bases such as ConceptNet represent knowledge in th...

05/28/2021
Alleviating the Knowledge-Language Inconsistency: A Study for Deep Commonsense Knowledge
Knowledge facts are typically represented by relational triples, while w...

02/01/2021
Commonsense Knowledge Mining from Term Definitions
Commonsense knowledge has proven to be beneficial to a variety of applic...

05/24/2022
GeoMLAMA: Geo-Diverse Commonsense Probing on Multilingual Pre-Trained Language Models
Recent work has shown that Pre-trained Language Models (PLMs) have the a...

06/12/2022
CoSe-Co: Text Conditioned Generative CommonSense Contextualizer
Pre-trained Language Models (PTLMs) have been shown to perform well on n...

10/10/2022
Do Children Texts Hold The Key To Commonsense Knowledge?
Compiling comprehensive repositories of commonsense knowledge is a long-...

10/27/2020
DualTKB: A Dual Learning Bridge between Text and Knowledge Base
In this work, we present a dual learning approach for unsupervised text ...
