Language Generation with Multi-Hop Reasoning on Commonsense Knowledge Graph

09/24/2020
by Haozhe Ji, et al.

Despite the success of generative pre-trained language models on a series of text generation tasks, they still struggle in cases where reasoning over underlying commonsense knowledge is required during generation. Existing approaches that integrate commonsense knowledge into generative pre-trained language models simply transfer relational knowledge by post-training on individual knowledge triples, ignoring the rich connections within the knowledge graph. We argue that exploiting both the structural and semantic information of the knowledge graph facilitates commonsense-aware text generation. In this paper, we propose Generation with Multi-Hop Reasoning Flow (GRF), which equips pre-trained models with dynamic multi-hop reasoning over multi-relational paths extracted from an external commonsense knowledge graph. We empirically show that our model outperforms existing baselines on three text generation tasks that require reasoning over commonsense knowledge. We also demonstrate the effectiveness of the dynamic multi-hop reasoning module through the reasoning paths inferred by the model, which provide a rationale for the generated text.
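
The abstract does not spell out the mechanism, but the general idea it describes, scoring concepts by propagating evidence along multi-relational paths outward from the concepts grounded in the input and letting the decoder choose between emitting one of those concepts and an ordinary vocabulary token, can be sketched roughly as below. The toy triples, the multi_hop_scores helper, the damping factor, and the copy gate are all illustrative assumptions for exposition, not the authors' implementation.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - np.max(x))
    return e / e.sum()

def multi_hop_scores(edges, triple_scores, source_nodes, num_hops, damping=0.8):
    """Propagate relevance scores outward from the source concepts.

    edges:         dict mapping head concept -> list of (relation, tail concept)
    triple_scores: dict mapping (head, relation, tail) -> plausibility in [0, 1]
    source_nodes:  concepts grounded in the input text (score 1.0 at hop 0)
    """
    scores = {n: 1.0 for n in source_nodes}
    frontier = set(source_nodes)
    for _ in range(num_hops):
        next_frontier = set()
        for head in frontier:
            for rel, tail in edges.get(head, []):
                # A node accumulates evidence from already-scored neighbours,
                # discounted by the plausibility of the connecting triple.
                contrib = damping * scores[head] * triple_scores.get((head, rel, tail), 0.5)
                if contrib > scores.get(tail, 0.0):
                    scores[tail] = contrib
                    next_frontier.add(tail)
        frontier = next_frontier
    return scores

# Toy ConceptNet-style subgraph (hypothetical triples, for illustration only).
edges = {
    "fire": [("CausesDesire", "extinguish"), ("HasProperty", "hot")],
    "extinguish": [("UsedFor", "water"), ("RelatedTo", "firefighter")],
    "hot": [("RelatedTo", "burn")],
}
triple_scores = {
    ("fire", "CausesDesire", "extinguish"): 0.9,
    ("fire", "HasProperty", "hot"): 0.8,
    ("extinguish", "UsedFor", "water"): 0.9,
    ("extinguish", "RelatedTo", "firefighter"): 0.7,
    ("hot", "RelatedTo", "burn"): 0.6,
}

# Score every concept reachable within two hops of the grounded concept "fire".
concept_scores = multi_hop_scores(edges, triple_scores, ["fire"], num_hops=2)

# At each decoding step, a gate could mix the language model's vocabulary
# distribution with a distribution over reachable concepts (copy-style mechanism).
gate = 0.4  # assumed probability of emitting a concept rather than a vocabulary token
concepts = list(concept_scores)
concept_dist = softmax(np.array([concept_scores[c] for c in concepts]))
for c, p in zip(concepts, concept_dist):
    print(f"{c:12s} p(copy) = {gate * p:.3f}")
```

In the full model the reasoning is dynamic, i.e., recomputed at each generation step and trained jointly with the pre-trained language model; the sketch only conveys how evidence accumulated over multiple hops of the knowledge graph can bias which concepts surface in the output, and how the traversed paths double as a rationale for the generated text.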

Related research

KG-BART: Knowledge Graph-Augmented BART for Generative Commonsense Reasoning (09/26/2020)
Generative commonsense reasoning, which aims to empower machines to gener...

Graph-based Multi-hop Reasoning for Long Text Generation (09/28/2020)
Long text generation is an important but challenging task. The main probl...

Commonsense Knowledge Graph Reasoning by Selection or Generation? Why? (08/13/2020)
Commonsense knowledge graph reasoning (CKGR) is the task of predicting a ...

CoSe-Co: Text Conditioned Generative CommonSense Contextualizer (06/12/2022)
Pre-trained Language Models (PTLMs) have been shown to perform well on n...

Ranking and Selecting Multi-Hop Knowledge Paths to Better Predict Human Needs (04/01/2019)
To make machines better understand sentiments, research needs to move fr...

Revisiting Generative Commonsense Reasoning: A Pre-Ordering Approach (05/26/2022)
Pre-trained models (PTMs) have led to great improvements in natural lan...