Word Embeddings Are Capable of Capturing Rhythmic Similarity of Words

04/11/2022
by Hosein Rezaei, et al.

Word embedding systems such as Word2Vec and GloVe are well established in deep-learning approaches to NLP, largely because of their ability to capture semantic relationships between words. In this work we investigate whether they also capture rhythmic similarity. The results show that the vectors these embeddings assign to rhyming words are more similar to each other than those of non-rhyming words, and that GloVe performs relatively better than Word2Vec in this regard. We also propose a first-of-its-kind metric for quantifying the rhythmic similarity of a pair of words.
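The kind of comparison the abstract describes — checking whether the vectors a pretrained embedding assigns to rhyming word pairs are closer than those of non-rhyming pairs — can be sketched with cosine similarity. The vectors below are toy, hypothetical stand-ins; in practice they would be loaded from a pretrained Word2Vec or GloVe model, and this is only a minimal illustration of the measurement, not the paper's actual method.

```python
import numpy as np

def cosine_similarity(u, v):
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Toy stand-ins for pretrained embedding vectors (hypothetical values;
# real vectors would come from a Word2Vec or GloVe model).
embeddings = {
    "cat":  np.array([0.9, 0.1, 0.3]),
    "hat":  np.array([0.8, 0.2, 0.4]),   # rhymes with "cat"
    "tree": np.array([-0.5, 0.7, 0.1]),  # does not rhyme with "cat"
}

rhyme_sim = cosine_similarity(embeddings["cat"], embeddings["hat"])
other_sim = cosine_similarity(embeddings["cat"], embeddings["tree"])
print(rhyme_sim, other_sim)
```

The paper's claim amounts to this inequality holding on average over many rhyming and non-rhyming pairs drawn from real pretrained vectors, rather than for a single hand-picked example.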


