MUSE: Modularizing Unsupervised Sense Embeddings

04/15/2017
by Guang-He Lee, et al.

This paper addresses word sense ambiguity in an unsupervised manner, where word sense representations are learned along with a word sense selection mechanism given contexts. Prior work on learning multi-sense embeddings suffered either from the ambiguity of mixing word-level and sense-level embeddings or from inefficient sense selection. The proposed modular framework, MUSE, implements flexible modules that optimize these distinct mechanisms, achieving the first purely sense-level representation learning system with linear-time sense selection. We leverage reinforcement learning to enable joint training of the proposed modules, and introduce various exploration techniques on sense selection for better robustness. Experiments on benchmark data show that the proposed approach achieves state-of-the-art performance on synonym selection as well as on contextual word similarity in terms of MaxSimC.
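
To make the mechanisms above concrete, here is a minimal NumPy sketch of linear-time sense selection with epsilon-greedy exploration, plus the MaxSimC metric mentioned in the evaluation. Everything in it is an illustrative assumption rather than the paper's exact architecture: the sizes, the bag-of-words context scorer, and the simple epsilon-greedy rule stand in for MUSE's trained neural selection module and its reinforcement-learning-based exploration strategies.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes for illustration; the real model is trained on a large corpus.
VOCAB, K, DIM = 1000, 3, 50   # vocabulary size, senses per word, embedding dimension

W = rng.normal(scale=0.1, size=(VOCAB, DIM))     # word embeddings feeding the selection module
Q = rng.normal(scale=0.1, size=(VOCAB, K, DIM))  # per-sense selection weights (Q-value style)
S = rng.normal(scale=0.1, size=(VOCAB, K, DIM))  # sense embeddings from the representation module

def select_sense(word, context_ids, epsilon=0.05):
    """Score the K senses of `word` against an averaged context and pick one.

    Cost is O(K * DIM) per target word, independent of the rest of the
    sense inventory, which is what makes selection linear-time.
    Epsilon-greedy sampling stands in for the exploration techniques the
    paper introduces for more robust joint training.
    """
    ctx = W[context_ids].mean(axis=0)   # bag-of-words context vector (an assumption)
    scores = Q[word] @ ctx              # one score per sense, shape (K,)
    if rng.random() < epsilon:
        return int(rng.integers(K))     # explore: try a random sense
    return int(np.argmax(scores))       # exploit: best-scoring sense

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def max_sim_c(w1, ctx1, w2, ctx2):
    """MaxSimC: cosine similarity between the single sense of each word
    selected from its own context (greedy, no exploration)."""
    s1 = select_sense(w1, ctx1, epsilon=0.0)
    s2 = select_sense(w2, ctx2, epsilon=0.0)
    return cosine(S[w1, s1], S[w2, s2])

# Example: disambiguate word 42 in one context window, then compare it
# with word 17 disambiguated in a different context.
ctx_a = np.array([7, 99, 512, 3])
ctx_b = np.array([5, 88, 640, 2])
print(max_sim_c(42, ctx_a, 17, ctx_b))
```

The point the sketch preserves is architectural: because selection touches only the K senses of the target word, a full pass over the corpus stays linear in its length.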

Related research

05/05/2023
Context-Aware Semantic Similarity Measurement for Unsupervised Word Sense Disambiguation
The issue of word sense ambiguity poses a significant challenge in natur...

06/24/2019
Language Modelling Makes Sense: Propagating Representations through WordNet for Full-Coverage Word Sense Disambiguation
Contextual embeddings represent a new generation of semantic representat...

05/28/2019
Automatic Ambiguity Detection
Most work on sense disambiguation presumes that one knows beforehand -- ...

07/06/2017
A Simple Approach to Learn Polysemous Word Embeddings
Many NLP applications require disambiguating polysemous words. Existing ...

04/22/2018
Inducing and Embedding Senses with Scaled Gumbel Softmax
Methods for learning word sense embeddings represent a single word with ...

08/22/2000
Explaining away ambiguity: Learning verb selectional preference with Bayesian networks
This paper presents a Bayesian model for unsupervised learning of verb s...

10/10/2020
Automated Concatenation of Embeddings for Structured Prediction
Pretrained contextualized embeddings are powerful word representations f...
