Multiplex Word Embeddings for Selectional Preference Acquisition

01/09/2020
by Hongming Zhang, et al.

Conventional word embeddings represent each word with a single fixed vector, usually trained from co-occurrence patterns among words. Such representations are limited, however, because the same word can function differently under different syntactic relations. One way to address this limitation is to incorporate the relational dependencies among words into their embeddings. In this paper, we therefore propose a multiplex word embedding model that can be easily extended to various relations among words. In this model, each word has a center embedding that represents its overall semantics and several relational embeddings that represent its behavior under specific relations. Compared with existing models, ours effectively distinguishes words with respect to different relations without introducing unnecessary sparseness. Moreover, to accommodate many relations, we use a small dimension for the relational embeddings, and the model remains effective despite this compactness. Experiments on selectional preference acquisition and word similarity demonstrate the effectiveness of the proposed model, and a further study of scalability shows that our embeddings need only 1/20 of the original embedding size to achieve better performance.
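To make the architecture concrete, here is a minimal sketch (not the authors' code) of how a multiplex embedding model along these lines could be organized. The class name, the dimensions (300 for the center embedding and 15, i.e., 1/20 of it, for each relational embedding), the relation labels, and the concatenation-based composition are all illustrative assumptions; the paper's exact formulation may differ.

```python
# A minimal sketch of the multiplex embedding idea: each word keeps one
# large "center" embedding for its overall semantics, plus one small
# embedding per syntactic relation. All names and dimensions here are
# illustrative assumptions, not details taken from the paper.
import torch
import torch.nn as nn


class MultiplexWordEmbedding(nn.Module):
    def __init__(self, vocab_size, relations, center_dim=300, rel_dim=15):
        super().__init__()
        # One full-size embedding per word for overall semantics.
        self.center = nn.Embedding(vocab_size, center_dim)
        # One small embedding table per relation (here 1/20 of center_dim),
        # capturing how a word behaves under that specific relation.
        self.relational = nn.ModuleDict({
            rel: nn.Embedding(vocab_size, rel_dim) for rel in relations
        })

    def forward(self, word_ids, relation):
        # Illustrative composition: concatenate the shared center vector
        # with the relation-specific vector to obtain the word's view
        # under the given relation.
        return torch.cat(
            [self.center(word_ids), self.relational[relation](word_ids)],
            dim=-1,
        )


model = MultiplexWordEmbedding(
    vocab_size=10000, relations=["nsubj", "dobj", "amod"]
)
ids = torch.tensor([42, 7])
# Same words, different relational views: only the small relation-specific
# part changes, so adding a new relation stays cheap.
subj_view = model(ids, "nsubj")  # shape: (2, 315)
obj_view = model(ids, "dobj")    # shape: (2, 315)
```

Because every relation adds only a small table rather than a second full-size embedding space, this design avoids the sparseness of training one independent embedding per word per relation while still keeping the relational views distinct.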


