Answering Complex Queries in Knowledge Graphs with Bidirectional Sequence Encoders

04/06/2020
by   Bhushan Kotnis, et al.

Representation learning for knowledge graphs (KGs) has focused on the problem of answering simple link prediction queries. In this work we address the more ambitious challenge of predicting the answers of conjunctive queries with multiple missing entities. We propose Bi-Directional Query Embedding (BiQE), a method that embeds conjunctive queries with models based on bi-directional attention mechanisms. In contrast to prior work, bidirectional self-attention can capture interactions among all the elements of a query graph. We introduce a new dataset for predicting the answers of conjunctive queries and conduct experiments that show BiQE significantly outperforming state-of-the-art baselines.
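
The sketch below illustrates the general idea described in the abstract: linearize a conjunctive query into a token sequence, run it through a bidirectional (Transformer-style) encoder so every query element can attend to every other, and score the hidden state at the missing-entity position against all candidate entities. This is a minimal illustration only, not the authors' implementation; the token ids, the `[TGT]` placeholder, and all dimensions are hypothetical.

```python
# Minimal sketch (not the authors' code): encode a linearized conjunctive
# query with a bidirectional Transformer encoder and score candidate answers.
import torch
import torch.nn as nn

class ToyQueryEncoder(nn.Module):
    def __init__(self, vocab_size: int, num_entities: int, dim: int = 64):
        super().__init__()
        self.tok = nn.Embedding(vocab_size, dim)           # entity/relation/special tokens
        self.pos = nn.Embedding(32, dim)                   # toy positional embeddings
        layer = nn.TransformerEncoderLayer(d_model=dim, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)  # bidirectional self-attention
        self.answer_emb = nn.Embedding(num_entities, dim)  # candidate answer entities

    def forward(self, token_ids: torch.Tensor, target_pos: torch.Tensor) -> torch.Tensor:
        pos_ids = torch.arange(token_ids.size(1), device=token_ids.device)
        h = self.encoder(self.tok(token_ids) + self.pos(pos_ids))
        # Read out the hidden state at the missing-entity ([TGT]) position
        # and score it against every candidate entity.
        tgt = h[torch.arange(h.size(0)), target_pos]        # (batch, dim)
        return tgt @ self.answer_emb.weight.T               # (batch, num_entities)

# Example: one query path "e1 -r1-> ? -r2-> [TGT]" linearized into token ids.
model = ToyQueryEncoder(vocab_size=100, num_entities=50)
tokens = torch.tensor([[5, 17, 3, 23, 1]])   # hypothetical ids; 1 = [TGT] placeholder
scores = model(tokens, target_pos=torch.tensor([4]))
print(scores.shape)  # torch.Size([1, 50])
```

Because the encoder is bidirectional, tokens from different branches of the query graph can exchange information directly, which is the property the abstract contrasts with prior (path- or box-based) query embedding approaches.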
