GreaseLM: Graph REASoning Enhanced Language Models for Question Answering

01/21/2022
by   Xikun Zhang, et al.

Answering complex questions about textual narratives requires reasoning over both the stated context and the world knowledge that underlies it. However, pretrained language models (LMs), the foundation of most modern QA systems, do not robustly represent latent relationships between concepts, an ability necessary for reasoning. While knowledge graphs (KGs) are often used to augment LMs with structured representations of world knowledge, it remains an open question how to effectively fuse and reason over the KG representations and the language context, which provides situational constraints and nuances. In this work, we propose GreaseLM, a new model that fuses encoded representations from pretrained LMs and graph neural networks over multiple layers of modality interaction operations. Each modality propagates information to the other, allowing language context representations to be grounded by structured world knowledge, and allowing linguistic nuances (e.g., negation, hedging) in the context to inform the graph representations of knowledge. Our results on three benchmarks in the commonsense reasoning (CommonsenseQA, OpenbookQA) and medical question answering (MedQA-USMLE) domains demonstrate that GreaseLM can more reliably answer questions that require reasoning over both situational constraints and structured knowledge, even outperforming models 8x larger.
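The core of the "modality interaction operation" the abstract describes can be sketched very roughly: at each layer, a special interaction token on the LM side and a special interaction node on the KG side are concatenated, mixed by a small network, and split back into their respective streams. The following is a minimal NumPy sketch under that assumption; the dimensions, random parameters, and the `interact` / `mlp` names are all illustrative and not taken from the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 8  # toy hidden size; real models use hundreds of dimensions

def mlp(x, W1, b1, W2, b2):
    # Two-layer MLP with ReLU, standing in for the modality-interaction unit.
    h = np.maximum(0.0, x @ W1 + b1)
    return h @ W2 + b2

# Hypothetical parameters for one interaction layer (random for the sketch).
W1 = rng.normal(size=(2 * D, 2 * D)); b1 = np.zeros(2 * D)
W2 = rng.normal(size=(2 * D, 2 * D)); b2 = np.zeros(2 * D)

def interact(int_token, int_node):
    """Fuse the LM's interaction token with the KG's interaction node.

    Concatenate both D-dim vectors, mix them with an MLP, then split the
    result back into an updated token and an updated node representation,
    so information flows in both directions at this layer.
    """
    mixed = mlp(np.concatenate([int_token, int_node]), W1, b1, W2, b2)
    return mixed[:D], mixed[D:]

# Toy inputs standing in for layer-l outputs of the LM and GNN streams.
tok = rng.normal(size=D)   # interaction token from the text encoder
node = rng.normal(size=D)  # interaction node from the graph encoder
new_tok, new_node = interact(tok, node)
```

In the full model this exchange happens at every one of the upper layers, so grounding from the graph and nuance from the text accumulate jointly rather than being fused once at the end.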

