A First Experiment on Including Text Literals in KGloVe

07/31/2018
by Michael Cochez, et al.

Graph embedding models produce embedding vectors for entities and relations in Knowledge Graphs, often without taking literal properties into account. We present an initial idea that combines global graph structure with the additional information provided by textual literals in properties. Our initial experiment shows that this approach might be useful, but it does not clearly outperform earlier approaches when evaluated on machine learning tasks.
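To make the idea concrete, the sketch below shows one plausible way to combine graph structure with text literals in a GloVe-style model. It is only an illustration, not the authors' actual method: KGloVe derives entity co-occurrence weights from Personalized PageRank over the graph, whereas this toy version uses simple triple-level co-occurrence, and the triples, the `is_literal` heuristic, and all hyperparameters are made up for the example. The core point it demonstrates is building one joint co-occurrence matrix over entities and words taken from literal values, then factorizing it with the usual GloVe objective.

```python
import numpy as np
from collections import defaultdict

# Hypothetical toy data: (subject, predicate, object) triples, where some
# objects are text literals rather than entities.
triples = [
    ("dbr:Berlin", "dbo:country", "dbr:Germany"),
    ("dbr:Berlin", "rdfs:comment", "Berlin is the capital of Germany"),
    ("dbr:Germany", "rdfs:label", "Germany"),
]

def is_literal(value):
    # Crude heuristic for this sketch only: treat values without a URI-style
    # prefix, or containing whitespace, as text literals.
    return ":" not in value or " " in value

# Build a joint co-occurrence matrix over entities AND words from literals.
# (KGloVe proper would weight entity pairs by Personalized PageRank scores;
# plain triple-level counts keep the sketch short.)
cooc = defaultdict(float)
for s, p, o in triples:
    if is_literal(o):
        for word in o.lower().split():
            cooc[(s, word)] += 1.0
            cooc[(word, s)] += 1.0
    else:
        cooc[(s, o)] += 1.0
        cooc[(o, s)] += 1.0

vocab = sorted({t for pair in cooc for t in pair})
index = {t: i for i, t in enumerate(vocab)}

# GloVe-style factorization: fit w_i . w~_j + b_i + b~_j ~= log X_ij,
# weighted by f(X_ij) = min(1, (X_ij / x_max) ** alpha).
dim, lr, x_max, alpha = 16, 0.05, 100.0, 0.75
rng = np.random.default_rng(0)
W = rng.normal(scale=0.1, size=(len(vocab), dim))
W_ctx = rng.normal(scale=0.1, size=(len(vocab), dim))
b = np.zeros(len(vocab))
b_ctx = np.zeros(len(vocab))

for epoch in range(50):
    for (t_i, t_j), x_ij in cooc.items():
        i, j = index[t_i], index[t_j]
        weight = min(1.0, (x_ij / x_max) ** alpha)
        diff = W[i] @ W_ctx[j] + b[i] + b_ctx[j] - np.log(x_ij)
        grad = weight * diff
        w_i_old = W[i].copy()
        W[i] -= lr * grad * W_ctx[j]
        W_ctx[j] -= lr * grad * w_i_old
        b[i] -= lr * grad
        b_ctx[j] -= lr * grad

# Entities and literal words end up in the same embedding space.
embeddings = W + W_ctx
print(embeddings[index["dbr:Berlin"]][:4])
```

Because entities and literal words share one vector space, an entity's embedding is pulled toward the words that describe it, which is the kind of signal the abstract suggests adding on top of the global graph structure.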


