Word Sense Induction with Neural biLM and Symmetric Patterns

08/26/2018
by   Asaf Amrami, et al.
An established method for Word Sense Induction (WSI) uses a language model (LM) to predict probable substitutes for target words, and induces senses by clustering the resulting substitute vectors. We replace the ngram-based LM with a recurrent one. Beyond being more accurate, the recurrent LM allows us to query it effectively in a creative way, using what we call dynamic symmetric patterns. The combination of the RNN-LM and the dynamic symmetric patterns results in strong substitute vectors for WSI, allowing us to surpass the current state of the art on the SemEval 2013 WSI shared task by a large margin.
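The pipeline the abstract describes can be sketched as follows: each occurrence of a target word is represented by a substitute vector (the probabilities the LM assigns to replacement words in that context), and occurrences are then clustered into senses. The substitute distributions and the greedy single-link clustering below are illustrative toy stand-ins, not the paper's actual biLM queries or clustering algorithm; in the paper, the LM is additionally queried through conjunctive "dynamic symmetric pattern" contexts (e.g. "X and _") to bias predictions toward same-sense substitutes.

```python
# Toy sketch of substitute-vector WSI. Real systems obtain these
# distributions by querying a (bi)LM; here they are hand-written.
occurrences = {
    "deposit money at the bank": {"institution": 0.5, "branch": 0.3, "lender": 0.2},
    "the bank approved the loan": {"institution": 0.6, "lender": 0.4},
    "sat on the river bank": {"shore": 0.6, "edge": 0.4},
    "the bank of the stream": {"shore": 0.5, "edge": 0.3, "slope": 0.2},
}

def cosine(u, v):
    """Cosine similarity between two sparse substitute vectors (dicts)."""
    dot = sum(u.get(k, 0.0) * v.get(k, 0.0) for k in set(u) | set(v))
    nu = sum(x * x for x in u.values()) ** 0.5
    nv = sum(x * x for x in v.values()) ** 0.5
    return dot / (nu * nv)

def induce_senses(vectors, threshold=0.2):
    """Greedy single-link clustering of substitute vectors into senses."""
    clusters = []  # each cluster is a list of occurrence keys
    for occ, vec in vectors.items():
        for cluster in clusters:
            if any(cosine(vec, vectors[m]) > threshold for m in cluster):
                cluster.append(occ)
                break
        else:
            clusters.append([occ])
    return clusters

senses = induce_senses(occurrences)
# The two financial contexts share "institution"/"lender" substitutes and the
# two riverside contexts share "shore"/"edge", so two senses are induced.
```

The key property this illustrates: contexts evoking the same sense draw overlapping substitute distributions from the LM, so even a simple similarity-based clustering separates the senses.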
