Robust and Consistent Estimation of Word Embedding for Bangla Language by fine-tuning Word2Vec Model

10/26/2020
by Rifat Rahman, et al.

Word embedding, or the vector representation of words, captures the syntactic and semantic characteristics of a word and can serve as an informative feature for any machine-learning-based natural language processing model. Several deep-learning-based approaches exist for learning word vectors, such as word2vec, fastText, and GloVe, with implementations available in libraries like gensim. In this study, we analyze the word2vec model for learning word vectors by tuning different hyper-parameters and present the most effective word embedding for the Bangla language. To test the performance of the different word embeddings produced by fine-tuning the word2vec model, we perform both intrinsic and extrinsic evaluations. We cluster the word vectors to examine the relational similarity of words, and we also use the different word embeddings as features of a news article classifier for extrinsic evaluation. From our experiments, we find that 300-dimensional word vectors generated by the skip-gram variant of the word2vec model with a sliding window size of 4 give the most robust vector representations for the Bangla language.
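To make the reported configuration concrete, the following is a minimal sketch of training a word2vec model with the hyper-parameters the abstract identifies as best for Bangla (skip-gram, 300-dimensional vectors, window size 4). It assumes the gensim (>= 4.0) implementation of word2vec; the tiny `corpus` variable is a hypothetical placeholder, not the authors' dataset or pipeline.

```python
from gensim.models import Word2Vec

# Placeholder corpus: an iterable of tokenized Bangla sentences.
# Replace this with a real tokenized corpus before training.
corpus = [
    ["বাংলা", "ভাষা", "শব্দ"],
    ["শব্দ", "ভেক্টর", "উপস্থাপনা"],
]

# Hyper-parameters matching the configuration reported in the abstract.
model = Word2Vec(
    sentences=corpus,
    vector_size=300,  # embedding dimension (gensim >= 4.0 uses vector_size)
    window=4,         # sliding context window size
    sg=1,             # 1 = skip-gram, 0 = CBOW
    min_count=1,      # keep rare words in this toy example
    workers=4,        # parallel training threads
)

# Intrinsic-style inspection: look up a learned vector and its neighbours.
vector = model.wv["বাংলা"]
print(model.wv.most_similar("বাংলা", topn=5))
```

For extrinsic evaluation along the lines described above, the learned vectors (`model.wv`) would be fed as features into a downstream classifier, for example by averaging the word vectors of each news article before training a standard supervised model.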

