Robust and Consistent Estimation of Word Embedding for Bangla Language by fine-tuning Word2Vec Model

10/26/2020
by Rifat Rahman, et al.

Word embeddings, or vector representations of words, capture the syntactic and semantic characteristics of words and can serve as informative features for machine-learning-based natural language processing models. Several neural models exist for learning word vectors, such as word2vec, fastText, and GloVe (commonly trained through libraries like Gensim). In this study, we analyze the word2vec model by tuning its hyper-parameters and present the most effective word embedding for the Bangla language. To test the performance of the different word embeddings induced by fine-tuning the word2vec model, we perform both intrinsic and extrinsic evaluations: we cluster the word vectors to examine the relational similarity of words, and we use the different embeddings as features for a news article classifier. From our experiments, we find that 300-dimensional word vectors, generated by the skip-gram variant of word2vec with a sliding window size of 4, give the most robust vector representations for Bangla.
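As a rough illustration (not the authors' released code), the best-performing configuration reported in the abstract could be reproduced with Gensim's Word2Vec class roughly as follows. The corpus here is a two-sentence placeholder; a real run would use a large tokenized Bangla corpus, and the Gensim >= 4.0 parameter names are assumed.

    # Minimal sketch of the reported best setting: skip-gram,
    # 300-dimensional vectors, sliding window of 4 (assumes Gensim >= 4.0).
    from gensim.models import Word2Vec

    # Hypothetical tokenized corpus: one list of tokens per sentence.
    corpus = [
        ["বাংলা", "ভাষা", "সুন্দর"],
        ["আজ", "আবহাওয়া", "ভালো"],
    ]

    model = Word2Vec(
        sentences=corpus,
        vector_size=300,  # embedding dimension reported as most robust
        window=4,         # sliding-window size reported as most robust
        sg=1,             # 1 = skip-gram (0 would be CBOW)
        min_count=1,      # keep all words in this toy corpus
        workers=4,
    )

    vector = model.wv["বাংলা"]                        # 300-dim word vector
    neighbors = model.wv.most_similar("বাংলা", topn=5)  # intrinsic-style check

For the extrinsic evaluation described above, vectors like these would then be fed as features to a news article classifier, for example by averaging the word vectors of each article into a single document vector.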
