Diversifying Content Generation for Commonsense Reasoning with Mixture of Knowledge Graph Experts

03/14/2022
by   Wenhao Yu, et al.

Generative commonsense reasoning (GCR) in natural language requires reasoning about commonsense knowledge while generating coherent text. Recent years have seen a surge of interest in improving the generation quality of commonsense reasoning tasks. Nevertheless, these approaches have seldom investigated diversity in GCR tasks, which aim to generate alternative explanations for a real-world situation or to predict all possible outcomes. Diversifying GCR is challenging because it requires generating multiple outputs that are not only semantically different but also grounded in commonsense knowledge. In this paper, we propose MoKGE, a novel method that diversifies generative reasoning via a mixture-of-experts (MoE) strategy on commonsense knowledge graphs (KG). A set of knowledge experts seek diverse reasoning paths on the KG to encourage varied generation outputs. Empirical experiments demonstrate that MoKGE significantly improves diversity while achieving on-par accuracy on two GCR benchmarks, according to both automatic and human evaluations.
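The core idea, a mixture of experts where each expert grounds generation in a different slice of a commonsense KG, can be illustrated with a minimal sketch. Everything here is hypothetical: the toy graph, the `expert_select` rotation heuristic (standing in for MoKGE's learned expert selection), and the template-based "generation" are illustrative assumptions, not the paper's actual implementation.

```python
# Toy commonsense KG: concept -> related concepts.
# Hypothetical data for illustration, not the paper's actual graph.
KG = {
    "river": ["water", "fish", "boat", "bridge"],
    "fish": ["water", "swim", "food"],
}

def expert_select(concepts, kg, expert_id, k=2):
    """Each 'expert' ranks KG neighbors differently (here: a simple
    rotation of the sorted neighbor list), so different experts ground
    generation in different knowledge. In MoKGE this selection is
    learned; the rotation is only a stand-in."""
    neighbors = sorted({n for c in concepts for n in kg.get(c, [])})
    shift = expert_id % len(neighbors)
    rotated = neighbors[shift:] + neighbors[:shift]
    return rotated[:k]

def diverse_outputs(concepts, kg, num_experts=3):
    """One candidate output per expert, each conditioned on that
    expert's selected KG concepts (a real system would feed the
    concepts to a seq2seq generator)."""
    outputs = []
    for e in range(num_experts):
        picked = expert_select(concepts, kg, e)
        outputs.append(
            f"Expert {e} grounds on {', '.join(picked)} "
            f"to explain {' and '.join(concepts)}."
        )
    return outputs

outs = diverse_outputs(["river", "fish"], KG)
```

Because each expert starts from a different region of the graph, the three candidates condition on different knowledge, which is the mechanism MoKGE uses to push the generator toward semantically distinct outputs.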


Related research

10/30/2021 · Automatic Knowledge Augmentation for Generative Commonsense Reasoning
Generative commonsense reasoning is the capability of a language model t...

07/28/2023 · Reasoning before Responding: Integrating Commonsense-based Causality Explanation for Empathetic Response Generation
Recent approaches to empathetic response generation try to incorporate c...

10/15/2022 · A User Interface for Sense-making of the Reasoning Process while Interacting with Robots
As knowledge graph has the potential to bridge the gap between commonsen...

04/15/2021 · ExplaGraphs: An Explanation Graph Generation Task for Structured Commonsense Reasoning
Recent commonsense-reasoning tasks are typically discriminative in natur...

12/20/2022 · DimonGen: Diversified Generative Commonsense Reasoning for Explaining Concept Relationships
In this paper, we propose DimonGen, which aims to generate diverse sente...

05/30/2023 · Less Likely Brainstorming: Using Language Models to Generate Alternative Hypotheses
A human decision-maker benefits the most from an AI assistant that corre...

06/12/2022 · CoSe-Co: Text Conditioned Generative CommonSense Contextualizer
Pre-trained Language Models (PTLMs) have been shown to perform well on n...
