Hierarchies over Vector Space: Orienting Word and Graph Embeddings

11/02/2022
by   Xingzhi Guo, et al.

Word and graph embeddings are widely used in deep learning applications. We present a data structure that captures inherent hierarchical properties from an unordered flat embedding space, particularly a sense of direction between pairs of entities. Inspired by the notion of distributional generality, our algorithm constructs an arborescence (a directed rooted tree) by inserting nodes in descending order of entity power (e.g., word frequency), pointing each entity to the closest more powerful node as its parent. We evaluate the performance of the resulting tree structures on three tasks: hypernym relation discovery, least-common-ancestor (LCA) discovery among words, and Wikipedia page link recovery. We achieve average accuracies of 8.98% and 2.70% on hypernym and LCA discovery across five languages, and 62.76% accuracy on directed Wiki-page link recovery, all substantially above baselines. Finally, we investigate the effect of insertion order, the power/similarity trade-off, and various power sources to optimize parent selection.
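The construction described above can be sketched in a few lines: sort entities by power, seed the tree with the most powerful one as root, then attach each subsequent entity to its most similar already-inserted (hence more powerful) node. This is a minimal illustration, assuming cosine similarity over normalized embedding vectors; the function name and interface are hypothetical, not from the paper.

```python
import numpy as np

def build_arborescence(vectors, powers):
    """Build a directed rooted tree over entities.

    Entities are inserted in descending order of power; each new
    entity's parent is the most cosine-similar entity already in
    the tree. Returns a dict mapping child index -> parent index
    (the root maps to None).
    """
    V = np.asarray(vectors, dtype=float)
    V = V / np.linalg.norm(V, axis=1, keepdims=True)  # unit vectors for cosine sim
    order = np.argsort(-np.asarray(powers))           # descending power
    root = int(order[0])
    parent = {root: None}
    inserted = [root]
    for i in order[1:]:
        i = int(i)
        sims = V[inserted] @ V[i]                     # cosine sims to tree nodes
        parent[i] = inserted[int(np.argmax(sims))]    # closest more powerful node
        inserted.append(i)
    return parent

# Toy example: entity 0 is most "powerful" (e.g., most frequent word),
# entity 2's vector is closer to entity 1 than to entity 0.
tree = build_arborescence(
    vectors=[[1.0, 0.0], [0.9, 0.1], [0.0, 1.0]],
    powers=[3, 2, 1],
)
print(tree)  # → {0: None, 1: 0, 2: 1}
```

Because each node is compared only against previously inserted nodes, the "more powerful parent" invariant holds by construction, and the result is always a single rooted tree.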


