Using Multi-Sense Vector Embeddings for Reverse Dictionaries

04/02/2019
by   Michael A. Hedderich, et al.

Popular word embedding methods such as word2vec and GloVe assign a single vector representation to each word, even if a word has multiple distinct meanings. Multi-sense embeddings instead provide a separate vector for each sense of a word. However, they typically cannot serve as a drop-in replacement for conventional single-sense embeddings, because the correct sense vector must be selected for each word. In this work, we study the effect of multi-sense embeddings on the task of reverse dictionaries. We propose a technique to integrate them easily into an existing neural network architecture using an attention mechanism. Our experiments demonstrate that large improvements can be obtained when employing multi-sense embeddings both in the input sequence and for the target representation. We also provide an analysis of the sense distributions and of the learned attention.
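The core idea of attention-based sense selection can be sketched as follows: each word contributes several sense vectors, and a context-dependent attention score picks (or softly mixes) among them. The dot-product scoring function and NumPy setup below are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def attend_senses(sense_vectors, context_vector):
    """Combine a word's sense vectors into a single representation.

    sense_vectors:  (num_senses, dim) matrix, one row per sense.
    context_vector: (dim,) vector summarizing the surrounding context.

    Returns the attention-weighted sense mixture and the weights.
    (Dot-product scoring is an assumption for this sketch.)
    """
    scores = sense_vectors @ context_vector   # one score per sense
    weights = softmax(scores)                 # attention distribution
    return weights @ sense_vectors, weights
```

In a trained model the context vector would come from an encoder (e.g. an RNN over the definition), so the attention learns to emphasize the sense vector matching the intended meaning.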


Related research:

06/29/2022  Chinese Word Sense Embedding with SememeWSD and Synonym Set
02/08/2019  Humor in Word Embeddings: Cockamamie Gobbledegook for Nincompoops
02/22/2017  One Representation per Word - Does it make Sense for Composition?
07/30/2019  SenseFitting: Sense Level Semantic Specialization of Word Embeddings for Word Sense Disambiguation
04/22/2018  Inducing and Embedding Senses with Scaled Gumbel Softmax
09/04/2018  A Novel Neural Sequence Model with Multiple Attentions for Word Sense Disambiguation
06/24/2019  LIAAD at SemDeep-5 Challenge: Word-in-Context (WiC)
