Preserving the Hypernym Tree of WordNet in Dense Embeddings

04/22/2020
by Canlin Zhang, et al.

In this paper, we present a novel way to generate low-dimensional (dense) vector embeddings for the noun and verb synsets in WordNet, such that the hypernym-hyponym tree structure is preserved in the embeddings. We call such an embedding a sense spectrum (plural: sense spectra). To create suitable labels for training sense spectra, we designed a new similarity measurement for noun and verb synsets in WordNet. We call this measurement the hypernym intersection similarity (HIS), since it compares the common and unique hypernyms of two synsets. Our experiments show that on the noun and verb pairs of the SimLex-999 dataset, HIS outperforms the three standard similarity measurements of WordNet. Moreover, to the best of our knowledge, sense spectra form the first dense embedding system that can explicitly and completely measure the hypernym-hyponym relationship in WordNet.
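To make the intuition behind HIS concrete, here is a minimal sketch of comparing the common and unique hypernyms of two synsets. The toy taxonomy and the Jaccard-style ratio below are illustrative assumptions for exposition only; they are not the paper's exact HIS formula or WordNet's actual hierarchy.

```python
# Toy hypernym DAG: synset -> list of direct hypernyms (parents).
# This taxonomy is a made-up illustration, not WordNet itself.
HYPERNYMS = {
    "dog": ["canine"],
    "canine": ["carnivore"],
    "cat": ["feline"],
    "feline": ["carnivore"],
    "carnivore": ["mammal"],
    "mammal": ["animal"],
    "animal": [],
}

def hypernym_closure(synset):
    """All hypernyms reachable from `synset`, including the synset itself."""
    seen, stack = set(), [synset]
    while stack:
        s = stack.pop()
        if s not in seen:
            seen.add(s)
            stack.extend(HYPERNYMS.get(s, []))
    return seen

def intersection_similarity(a, b):
    """Ratio of shared hypernyms to all hypernyms of the two synsets.

    A hypothetical stand-in for HIS: the more hypernyms two synsets
    share (and the fewer they hold uniquely), the more similar they are.
    """
    ha, hb = hypernym_closure(a), hypernym_closure(b)
    return len(ha & hb) / len(ha | hb)

# "dog" and "cat" share {carnivore, mammal, animal} out of 7 hypernyms total.
print(intersection_similarity("dog", "cat"))  # -> 0.42857...
```

The same closure-and-compare pattern applies directly to real WordNet synsets, e.g. via NLTK's `synset.closure(lambda s: s.hypernyms())`.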


