Commonsense Knowledge Reasoning and Generation with Pre-trained Language Models: A Survey

01/28/2022
by Prajjwal Bhargava, et al.

While commonsense knowledge acquisition and reasoning have traditionally been core research topics in the knowledge representation and reasoning community, recent years have seen a surge of interest in the natural language processing community in developing pre-trained language models and testing their ability to address a variety of newly designed commonsense knowledge reasoning and generation tasks. This paper presents a survey of these tasks, discusses the strengths and weaknesses of state-of-the-art pre-trained models for commonsense reasoning and generation as revealed by these tasks, and reflects on future research directions.
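
To make the survey's subject concrete, here is a minimal sketch of one common zero-shot probing setup for commonsense tasks: score candidate statements by language-model log-likelihood and pick the more plausible one. It assumes the HuggingFace transformers library and GPT-2, neither of which is prescribed by the paper, and the example sentences are hypothetical.

```python
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

def log_likelihood(text: str) -> float:
    """Approximate total log-probability of `text` under the LM."""
    ids = tokenizer(text, return_tensors="pt").input_ids
    with torch.no_grad():
        # With labels=input_ids, the model returns the mean token cross-entropy.
        mean_nll = model(ids, labels=ids).loss.item()
    # Scale by the number of predicted tokens to recover a total log-prob.
    return -mean_nll * (ids.size(1) - 1)

# Hypothetical two-way commonsense choice (not taken from the paper).
candidates = [
    "She put the ice cream in the freezer so it would stay frozen.",
    "She put the ice cream in the oven so it would stay frozen.",
]
print(max(candidates, key=log_likelihood))
```

A well-calibrated model should prefer the first sentence; many of the benchmarks the survey covers reduce to variants of this likelihood scoring or to fine-tuned multiple-choice classification.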


Related research

04/02/2019 · Commonsense Reasoning for Natural Language Understanding: A Survey of Benchmarks, Resources, and Approaches
Commonsense knowledge and commonsense reasoning are some of the main bot...

10/13/2022 · Language Models of Code are Few-Shot Commonsense Learners
We address the general task of structured commonsense reasoning: given a...

02/02/2023 · Commonsense-Aware Prompting for Controllable Empathetic Dialogue Generation
Improving the emotional awareness of pre-trained language models is an e...

10/22/2022 · DiscoSense: Commonsense Reasoning with Discourse Connectives
We present DiscoSense, a benchmark for commonsense reasoning via underst...

02/15/2023 · Commonsense Reasoning for Conversational AI: A Survey of the State of the Art
Large, transformer-based pretrained language models like BERT, GPT, and ...

10/14/2022 · MiQA: A Benchmark for Inference on Metaphorical Questions
We propose a benchmark to assess the capability of large language models...

05/26/2022 · Revisiting Generative Commonsense Reasoning: A Pre-Ordering Approach
Pre-trained models (PTMs) have led to great improvements in natural lan...
