Materialized Knowledge Bases from Commonsense Transformers

12/29/2021
by   Tuan-Phong Nguyen, et al.

Starting from the COMET methodology of Bosselut et al. (2019), generating commonsense knowledge directly from pre-trained language models has recently received significant attention. Surprisingly, no materialized resource of commonsense knowledge generated this way has been publicly available until now. This paper fills that gap and uses the materialized resources to perform a detailed analysis of the approach's potential in terms of precision and recall. Furthermore, we identify common problem cases and outline use cases enabled by materialized resources. We posit that the availability of these resources is important for the advancement of the field, as it enables off-the-shelf use of the resulting knowledge, as well as further analyses of its strengths and weaknesses.
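To make "materialization" concrete: rather than querying a language model on demand, the generated (head, relation, tail) tuples are stored once and then looked up like an ordinary knowledge base. The sketch below is illustrative only and assumes hypothetical names (`materialize`, ATOMIC-style relations such as `xNeed`/`xEffect`); it is not the paper's or COMET's actual API.

```python
# Minimal sketch: materializing COMET-style generations into a queryable
# triple store. All tuple contents here are illustrative examples.
from collections import defaultdict

def materialize(generations):
    """Collect (head, relation, tail) tuples into a dict keyed by (head, relation)."""
    kb = defaultdict(set)
    for head, relation, tail in generations:
        kb[(head, relation)].add(tail)
    return kb

# Hypothetical generations with ATOMIC-style relations
generations = [
    ("PersonX goes to the store", "xNeed", "to have money"),
    ("PersonX goes to the store", "xEffect", "buys groceries"),
    ("PersonX goes to the store", "xNeed", "to drive there"),
]

kb = materialize(generations)
print(sorted(kb[("PersonX goes to the store", "xNeed")]))
# → ['to drive there', 'to have money']
```

Once materialized, lookups are constant-time dictionary accesses, which is what enables the off-the-shelf use and corpus-level precision/recall analyses the abstract describes.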


Related research

06/12/2019 — COMET: Commonsense Transformers for Automatic Knowledge Graph Construction
We present the first comprehensive study on automatic knowledge base con...

03/01/2018 — Yuanfudao at SemEval-2018 Task 11: Three-way Attention and Relational Knowledge for Commonsense Machine Comprehension
This paper describes our system for SemEval-2018 Task 11: Machine Compre...

09/05/2019 — Commonsense Reasoning Using WordNet and SUMO: a Detailed Analysis
We describe a detailed analysis of a sample of large benchmark of common...

04/18/2021 — CoreQuisite: Circumstantial Preconditions of Common Sense Knowledge
The task of identifying and reasoning with circumstantial preconditions ...

05/01/2020 — TransOMCS: From Linguistic Graphs to Commonsense Knowledge
Commonsense knowledge acquisition is a key problem for artificial intell...

10/30/2019 — Towards Generalizable Neuro-Symbolic Systems for Commonsense Question Answering
Non-extractive commonsense QA remains a challenging AI task, as it requi...

10/10/2021 — Language Models As or For Knowledge Bases
Pre-trained language models (LMs) have recently gained attention for the...
