Inducing and Embedding Senses with Scaled Gumbel Softmax

04/22/2018
by Fenfei Guo, et al.

Methods for learning word sense embeddings represent a single word with multiple sense-specific vectors. These methods should not only produce interpretable sense embeddings, but should also learn how to select which sense to use in a given context. We propose an unsupervised model that learns sense embeddings using a modified Gumbel softmax function, which allows for differentiable discrete sense selection. Our model produces sense embeddings that are competitive (and sometimes state of the art) on multiple similarity-based downstream evaluations. However, performance on these downstream evaluation tasks does not correlate with interpretability of sense embeddings, as we discover through an interpretability comparison with competing multi-sense embeddings. While many previous approaches perform well on downstream evaluations, they do not produce interpretable embeddings and instead learn duplicated sense groups; our method achieves the best of both worlds.
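To make the selection mechanism concrete, below is a minimal PyTorch sketch of differentiable sense selection with a scaled Gumbel softmax. The scaling factor, the temperature value, and the toy sense vectors are illustrative assumptions, not the paper's actual implementation or hyperparameters.

```python
# Minimal sketch (not the authors' code): differentiable sense selection
# with a scaled Gumbel softmax. `scale` and `temperature` are assumed
# hyperparameters for illustration only.
import torch
import torch.nn.functional as F

def scaled_gumbel_softmax(logits, scale=10.0, temperature=0.5):
    """Return near-one-hot selection weights over senses from `logits`.

    Scaling the logits before adding Gumbel noise sharpens the softmax
    toward a single sense while keeping the operation differentiable.
    """
    # Sample Gumbel(0, 1) noise; small epsilons guard against log(0).
    gumbel_noise = -torch.log(-torch.log(torch.rand_like(logits) + 1e-20) + 1e-20)
    return F.softmax((scale * logits + gumbel_noise) / temperature, dim=-1)

# Toy usage: choose among 3 sense vectors of a word given context scores.
context_scores = torch.tensor([0.1, 1.2, 0.3])   # hypothetical sense logits from context
sense_vectors = torch.randn(3, 50)               # 3 senses, 50-dim embeddings
weights = scaled_gumbel_softmax(context_scores)  # near-one-hot selection weights
selected_embedding = weights @ sense_vectors     # soft but peaked sense embedding
```

Because the selection weights are produced by a softmax rather than a hard argmax, gradients flow through the sense choice during training, which is what makes discrete sense selection trainable end to end.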


Related research

Making Sense of Word Embeddings (08/10/2017)
We present a simple yet effective approach for learning word sense embed...

Unsupervised, Knowledge-Free, and Interpretable Word Sense Disambiguation (07/21/2017)
Interpretability of a predictive model is a powerful feature that gains ...

A Simple Approach to Learn Polysemous Word Embeddings (07/06/2017)
Many NLP applications require disambiguating polysemous words. Existing ...

Preserving the Hypernym Tree of WordNet in Dense Embeddings (04/22/2020)
In this paper, we provide a novel way to generate low-dimension (dense) ...

Efficient Graph-based Word Sense Induction by Distributional Inclusion Vector Embeddings (04/09/2018)
Word sense induction (WSI), which addresses polysemy by unsupervised dis...

Using Multi-Sense Vector Embeddings for Reverse Dictionaries (04/02/2019)
Popular word embedding methods such as word2vec and GloVe assign a singl...

MUSE: Modularizing Unsupervised Sense Embeddings (04/15/2017)
This paper proposes to address the word sense ambiguity issue in an unsu...
