Why PairDiff works? -- A Mathematical Analysis of Bilinear Relational Compositional Operators for Analogy Detection

09/19/2017
by   Huda Hakami, et al.

Representing the semantic relations that exist between two given words (or entities) is an important first step in a wide range of NLP applications such as analogical reasoning, knowledge base completion and relational information retrieval. A simple, yet surprisingly accurate, method for representing a relation between two words is to compute the vector offset (PairDiff) between their corresponding word embeddings. Despite its empirical success, it remains unclear whether PairDiff is the best operator for obtaining a relational representation from word embeddings. We conduct a theoretical analysis of generalised bilinear operators that can be used to measure the ℓ_2 relational distance between two word pairs. We show that, if the word embeddings are standardised and uncorrelated, such an operator will be independent of the bilinear terms and can be simplified to a linear form, of which PairDiff is a special case. For numerous word embedding types, we empirically verify the uncorrelation assumption, demonstrating the general applicability of our theoretical result. Moreover, we experimentally recover PairDiff from the bilinear relational compositional operator on several benchmark analogy datasets.
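To make the abstract's objects concrete, the following is a rough sketch in our own hypothetical notation (not necessarily the paper's): a generalised bilinear relational operator over embeddings a, b ∈ ℝ^d, built from a third-order tensor 𝒜 and matrices B, C, together with the linear form the theoretical result reduces it to.

```latex
% Hypothetical notation: \mathcal{A} is a third-order tensor, B and C are
% d x d matrices; f composes a relational representation for the pair (a, b).
f(a, b) = \mathcal{A} \times_1 a \times_2 b + B a + C b

% Under standardised, uncorrelated embeddings, the \ell_2 relational distance
% is claimed to be independent of the bilinear term, leaving the linear form
f(a, b) = B a + C b

% of which PairDiff is the special case B = -I, C = I:
\mathrm{PairDiff}(a, b) = b - a
```

A minimal, self-contained illustration of PairDiff and the ℓ_2 relational distance it induces is sketched below; the four-dimensional embeddings are toy values invented for the example, and `relational_distance` is our own helper name, not the paper's.

```python
import numpy as np

# Toy embeddings, invented purely for illustration; in practice these would
# come from a pretrained model (e.g. GloVe or skip-gram with negative sampling).
emb = {
    "man":   np.array([0.2, -0.1, 0.5, 0.3]),
    "woman": np.array([0.1,  0.4, 0.6, 0.2]),
    "king":  np.array([0.7, -0.2, 0.4, 0.5]),
    "queen": np.array([0.6,  0.3, 0.5, 0.4]),
}

def pair_diff(a, b):
    """PairDiff: represent the relation of the word pair (a, b) by the offset b - a."""
    return b - a

def relational_distance(pair1, pair2):
    """l2 distance between the PairDiff representations of two word pairs."""
    return np.linalg.norm(pair_diff(*pair1) - pair_diff(*pair2))

# Analogous pairs should be close under the relational distance.
d = relational_distance((emb["man"], emb["king"]),
                        (emb["woman"], emb["queen"]))
print(f"relational distance (man:king vs woman:queen) = {d:.4f}")

# Sketch of the uncorrelation check mentioned in the abstract: standardise
# each embedding dimension over the vocabulary and inspect the off-diagonal
# entries of the resulting correlation matrix (near zero on a real vocabulary,
# though not on a four-word toy set like this one).
E = np.stack(list(emb.values()))          # vocabulary-by-dimension matrix
E = (E - E.mean(axis=0)) / E.std(axis=0)  # standardise each dimension
corr = (E.T @ E) / E.shape[0]             # dimension-dimension correlations
print(np.round(corr, 2))
```

On real embeddings, analogous pairs such as (man, king) and (woman, queen) tend to sit close under this distance, which is what the benchmark analogy datasets measure.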


Related research

09/19/2017 - An Optimality Proof for the PairDiff operator for Representing Relations between Words
Representing the semantic relations that exist between two given words (...

09/04/2017 - Compositional Approaches for Representing Relations Between Words: A Comparative Study
Identifying the relations that exist between words (or entities) is impo...

06/04/2019 - Relational Word Embeddings
While word embeddings have been shown to implicitly encode various forms...

08/21/2017 - Probabilistic Relation Induction in Vector Space Embeddings
Word embeddings have been found to capture a surprisingly rich amount of...

01/09/2020 - Multiplex Word Embeddings for Selectional Preference Acquisition
Conventional word embeddings represent words with fixed vectors, which a...

11/28/2019 - Inducing Relational Knowledge from BERT
One of the most remarkable properties of word embeddings is the fact tha...

04/24/2015 - Compositional Vector Space Models for Knowledge Base Completion
Knowledge base (KB) completion adds new facts to a KB by making inferenc...
