Roof-BERT: Divide Understanding Labour and Join in Work

12/13/2021
by Wei-Lin Liao, et al.

Recent work on enhancing BERT-based language representation models with knowledge graphs (KGs) has shown promising results on multiple NLP tasks. State-of-the-art approaches typically combine the original input sentences with triples from KGs and feed the combined representation into a BERT model. However, because the sequence length of a BERT model is limited, this framework cannot accommodate much knowledge beyond the original input sentences and is thus forced to discard some of it. The problem is especially severe for downstream tasks whose input is a long paragraph or even a whole document, such as question answering (QA) or reading comprehension. To address this problem, we propose Roof-BERT, a model with two underlying BERTs and a fusion layer on top of them. One of the underlying BERTs encodes the knowledge resources, the other encodes the original input sentences, and the fusion layer, like a roof, integrates both BERTs' encodings. Experimental results on a QA task demonstrate the effectiveness of the proposed model.

Related research

09/17/2019
K-BERT: Enabling Language Representation with Knowledge Graph
Pre-trained language representation models, such as BERT, capture a gene...

09/10/2021
Mixture-of-Partitions: Infusing Large Biomedical Knowledge Graphs into BERT
Infusing factual knowledge into pre-trained models is fundamental for ma...

10/18/2020
Towards Interpreting BERT for Reading Comprehension Based QA
BERT and its variants have achieved state-of-the-art performance in vari...

02/28/2020
DC-BERT: Decoupling Question and Document for Efficient Contextual Encoding
Recent studies on open-domain question answering have achieved prominent...

10/08/2022
KALM: Knowledge-Aware Integration of Local, Document, and Global Contexts for Long Document Understanding
With the advent of pre-trained language models (LMs), increasing researc...

04/24/2020
Contextualized Representations Using Textual Encyclopedic Knowledge
We present a method to represent input texts by contextualizing them joi...
