Analysis of Word Embeddings using Fuzzy Clustering

07/17/2019
by Shahin Atakishiyev, et al.

In data-dominated systems and applications, the idea of representing words in a numerical format has attracted considerable attention, and several approaches exist for generating such representations. An important question is how well these representations, called embeddings, reproduce human judgments of semantic similarity between words. In this study, we perform a fuzzy-based analysis of vector representations of words, i.e., word embeddings. We apply two popular fuzzy clustering algorithms to count-based word embeddings, known as GloVe, of different dimensionality. Words from WordSim-353, a widely used gold standard, are represented as vectors and clustered. The results indicate that fuzzy clustering algorithms are very sensitive to high-dimensional data and that parameter tuning can change their performance dramatically. We show that by adjusting the value of the fuzzifier parameter, fuzzy clustering can be applied successfully to vectors of up to one hundred dimensions. Additionally, we illustrate that fuzzy clustering provides informative results regarding the membership of words in different clusters.
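To make the role of the fuzzifier concrete, the following is a minimal Python sketch, not the authors' exact pipeline: it runs plain fuzzy c-means over a handful of GloVe vectors and prints the resulting soft memberships for two values of m. The GloVe file name (a standard Stanford GloVe release) and the short word list standing in for WordSim-353 are illustrative assumptions.

```python
# Minimal sketch: fuzzy c-means over a few GloVe vectors, showing how the
# fuzzifier m changes the softness of cluster memberships.
# Assumptions: a local "glove.6B.100d.txt" file and a small illustrative
# word list in place of the full WordSim-353 vocabulary.

import numpy as np


def load_glove(path, vocab):
    """Load GloVe vectors for the requested words from a plain-text file."""
    vectors = {}
    with open(path, encoding="utf-8") as fh:
        for line in fh:
            parts = line.rstrip().split(" ")
            if parts[0] in vocab:
                vectors[parts[0]] = np.asarray(parts[1:], dtype=float)
    return vectors


def fuzzy_c_means(X, c, m=2.0, n_iter=100, tol=1e-6, seed=0):
    """Plain fuzzy c-means: returns (centers, membership matrix U of shape (n, c))."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    U = rng.random((n, c))
    U /= U.sum(axis=1, keepdims=True)            # each row sums to 1
    for _ in range(n_iter):
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
        # Euclidean distance from every point to every center
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        new_U = 1.0 / (d ** (2.0 / (m - 1.0)))
        new_U /= new_U.sum(axis=1, keepdims=True)
        if np.abs(new_U - U).max() < tol:
            U = new_U
            break
        U = new_U
    return centers, U


if __name__ == "__main__":
    words = ["tiger", "cat", "lion", "car", "train", "bus", "money", "bank", "cash"]
    vecs = load_glove("glove.6B.100d.txt", set(words))   # hypothetical local path
    present = [w for w in words if w in vecs]             # skip words missing from the file
    X = np.vstack([vecs[w] for w in present])
    for m in (1.1, 2.0):                                   # smaller m -> crisper memberships
        _, U = fuzzy_c_means(X, c=3, m=m)
        print(f"m = {m}")
        for w, row in zip(present, U):
            print(f"  {w:>6}: " + " ".join(f"{p:.2f}" for p in row))
```

As m approaches 1 the memberships approach hard assignments, while larger m spreads membership more evenly across clusters; this general behavior of fuzzy c-means is what makes the choice of the fuzzifier critical for high-dimensional embedding vectors.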


Related research

10/18/2019
Estimator Vectors: OOV Word Embeddings based on Subword and Context Clue Estimates
Semantic representations of words have been successfully extracted from ...

02/08/2019
Humor in Word Embeddings: Cockamamie Gobbledegook for Nincompoops
We study humor in Word Embeddings, a popular AI tool that associates eac...

06/06/2019
Derivational Morphological Relations in Word Embeddings
Derivation is a type of a word-formation process which creates new words...

09/07/2015
Fuzzy Jets
Collimated streams of particles produced in high energy physics experime...

04/30/2019
Don't Settle for Average, Go for the Max: Fuzzy Sets and Max-Pooled Word Vectors
Recent literature suggests that averaged word vectors followed by simple...

12/16/2017
Taming Wild High Dimensional Text Data with a Fuzzy Lash
The bag of words (BOW) represents a corpus in a matrix whose elements ar...

08/02/2016
Exponential Family Embeddings
Word embeddings are a powerful approach for capturing semantic similarit...
