Towards a Theoretical Understanding of Word and Relation Representation

02/01/2022
by Carl Allen, et al.

Representing words by vectors, or embeddings, enables computational reasoning and is foundational to automating natural language tasks. For example, if the embeddings of similar words contain similar values, word similarity can be assessed directly, whereas judging similarity from spelling is often impossible (e.g. cat/feline), and precomputing and storing similarities between all word pairs is prohibitively time-consuming, memory-intensive and subjective.

We focus on word embeddings learned from text corpora and from knowledge graphs. Several well-known algorithms, e.g. word2vec and GloVe, learn word embeddings from text in an unsupervised manner by learning to predict the words that occur around each word. The parameters of such embeddings are known to reflect word co-occurrence statistics, but how they capture semantic meaning has been unclear. Knowledge graph representation models learn representations of both entities (words, people, places, etc.) and the relations between them, typically by training a model to predict known facts in a supervised manner. Despite steady improvements in fact prediction accuracy, little is understood of the latent structure that enables this.

This limited understanding of how latent semantic structure is encoded in the geometry of word embeddings and knowledge graph representations leaves no principled means of improving their performance, reliability or interpretability. To address this:

1. we theoretically justify the empirical observation that particular geometric relationships between word embeddings learned by algorithms such as word2vec and GloVe correspond to semantic relations between words; and

2. we extend this correspondence between semantics and geometry to the entities and relations of knowledge graphs, providing a model for the latent structure of knowledge graph representation that is linked to that of word embeddings.
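To make the co-occurrence connection concrete, here is a minimal sketch (assuming a toy corpus, a window of 2 and a dimension of 4, none of which come from the paper) of the well-known result that embeddings comparable to word2vec's can be recovered by factorising a positive PMI matrix of word co-occurrence statistics (Levy and Goldberg, 2014):

```python
# Toy sketch (not the paper's method): factorising a positive PMI matrix
# yields embeddings in which words sharing contexts, e.g. "cat"/"feline",
# tend to be geometrically close. Corpus, window and d are assumptions.
import numpy as np

corpus = ("the cat sat on the mat "
          "the feline sat on the rug "
          "the dog lay on the mat").split()
window, d = 2, 4

vocab = sorted(set(corpus))
idx = {w: i for i, w in enumerate(vocab)}

# Count word/context co-occurrences within a symmetric window.
C = np.zeros((len(vocab), len(vocab)))
for i, w in enumerate(corpus):
    for j in range(max(0, i - window), min(len(corpus), i + window + 1)):
        if j != i:
            C[idx[w], idx[corpus[j]]] += 1

# Positive pointwise mutual information: max(log p(w,c)/(p(w)p(c)), 0).
total = C.sum()
pw = C.sum(axis=1, keepdims=True) / total
pc = C.sum(axis=0, keepdims=True) / total
with np.errstate(divide="ignore"):
    ppmi = np.maximum(np.log((C / total) / (pw * pc)), 0.0)

# A truncated SVD of the PPMI matrix gives low-dimensional embeddings.
U, S, _ = np.linalg.svd(ppmi)
emb = U[:, :d] * np.sqrt(S[:d])

def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)

# Words sharing contexts should score higher than unrelated pairs.
print(cosine(emb[idx["cat"]], emb[idx["feline"]]))
print(cosine(emb[idx["cat"]], emb[idx["lay"]]))
```

Similarly, a bare-bones illustration of how knowledge graph models represent entities and relations and score candidate facts, using a translational scoring function in the style of TransE (Bordes et al., 2013); the random vectors below are placeholders for parameters that would normally be trained on known facts:

```python
# Minimal sketch of translational knowledge graph scoring: a triple
# (head, relation, tail) is deemed plausible when head + relation ≈ tail.
# Vectors are untrained random placeholders; in practice they are fit so
# that known facts outscore corrupted ones.
import numpy as np

rng = np.random.default_rng(0)
d = 8
entity = {e: rng.normal(size=d) for e in ("paris", "france", "berlin", "germany")}
relation = {"capital_of": rng.normal(size=d)}

def score(h, r, t):
    # Higher (less negative) means the model deems the fact more plausible.
    return -np.linalg.norm(entity[h] + relation[r] - entity[t])

# With trained vectors, a true triple such as (paris, capital_of, france)
# would score higher than a corruption such as (paris, capital_of, germany).
print(score("paris", "capital_of", "france"))
print(score("paris", "capital_of", "germany"))
```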

Related research

09/09/2019 · Composing Knowledge Graph Embeddings via Word Embeddings
Learning knowledge graph embedding from an existing knowledge graph is v...

09/25/2019 · On Understanding Knowledge Graph Representation
Many methods have been developed to represent knowledge graph data, whic...

02/05/2018 · Semantic projection: recovering human knowledge of multiple, distinct object features from word embeddings
The words of a language reflect the structure of the human mind, allowin...

06/12/2019 · Representation Learning for Words and Entities
This thesis presents new methods for unsupervised learning of distribute...

11/22/2015 · On the Linear Algebraic Structure of Distributed Word Representations
In this work, we leverage the linear algebraic structure of distributed ...

04/17/2020 · Exploring the Combination of Contextual Word Embeddings and Knowledge Graph Embeddings
“Classical” word embeddings, such as Word2Vec, have been shown to captur...

12/12/2016 · ConceptNet 5.5: An Open Multilingual Graph of General Knowledge
Machine learning about language can be improved by supplying it with spe...
