Pretrained Transformers for Simple Question Answering over Knowledge Graphs

01/31/2020
by D. Lukovnikov, et al.

Answering simple questions over knowledge graphs is a well-studied problem in question answering. Previous approaches to this task were built on recurrent and convolutional neural network architectures that use pretrained word embeddings. It was recently shown that fine-tuning pretrained transformer networks (e.g., BERT) can outperform previous approaches on various natural language processing tasks. In this work, we investigate how well BERT performs on SimpleQuestions and provide an evaluation of both BERT- and BiLSTM-based models in data-sparse scenarios.
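As a rough illustration of the fine-tuning setup mentioned in the abstract, the sketch below trains BERT as a relation classifier on SimpleQuestions-style (question, relation) pairs using the Hugging Face transformers library. The toy examples, label set, and hyperparameters are assumptions for illustration only, not the authors' exact configuration; a complete SimpleQuestions system would typically also need entity detection and candidate ranking against the knowledge graph, which this sketch omits.

```python
# Hedged sketch: fine-tuning BERT for relation prediction on
# SimpleQuestions-style (question, relation) pairs.
# The data, labels, and hyperparameters below are illustrative assumptions.
import torch
from torch.optim import AdamW
from transformers import BertTokenizerFast, BertForSequenceClassification

# Toy (question, relation) pairs standing in for SimpleQuestions examples.
examples = [
    ("who wrote the book gulliver's travels", "book/written_work/author"),
    ("what country is berlin located in", "location/location/containedby"),
]
relations = sorted({r for _, r in examples})
rel2id = {r: i for i, r in enumerate(relations)}

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=len(relations)
)
optimizer = AdamW(model.parameters(), lr=2e-5)

model.train()
for epoch in range(3):
    for question, relation in examples:
        enc = tokenizer(question, return_tensors="pt", truncation=True)
        labels = torch.tensor([rel2id[relation]])
        out = model(**enc, labels=labels)  # cross-entropy loss over relations
        out.loss.backward()
        optimizer.step()
        optimizer.zero_grad()

# Inference: pick the highest-scoring relation for a new question.
model.eval()
with torch.no_grad():
    enc = tokenizer("who directed the movie inception", return_tensors="pt")
    pred = model(**enc).logits.argmax(dim=-1).item()
print(relations[pred])
```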

