Evaluating Neural Word Embeddings for Sanskrit

04/01/2021
by Jivnesh Sandhan, et al.

The surprisingly strong performance of the supervised learning paradigm has recently attracted considerable attention from Sanskrit computational linguists. As a result, the Sanskrit community has made laudable efforts to build task-specific labelled data for various downstream Natural Language Processing (NLP) tasks. A primary component of these approaches is the word-embedding representation: word embeddings transfer knowledge learned from readily available unlabelled data to improve task-specific performance in low-resource settings. Over the last decade, there has also been much progress in the digitization of Sanskrit texts. To make effective use of these readily available resources, a systematic study of word-embedding approaches for Sanskrit is essential. In this work, we investigate the effectiveness of word embeddings for Sanskrit. We classify embedding approaches into broad categories to facilitate systematic experimentation and evaluate them on four intrinsic tasks. We examine the efficacy of embedding approaches originally proposed for other languages when applied to Sanskrit, along with the particular challenges the language itself poses.
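The abstract mentions evaluating embeddings on intrinsic tasks. A common intrinsic evaluation is word similarity, where the cosine similarity between two embedding vectors is compared against human judgments. The sketch below is a minimal, hypothetical illustration: the words and 4-dimensional vectors are made up for demonstration, whereas real embeddings would be learned from unlabelled Sanskrit text with far higher dimensionality.

```python
import math

# Hypothetical toy embeddings for illustration only; real embeddings
# would be trained on unlabelled Sanskrit corpora (typically 100-300 dims).
embeddings = {
    "deva": [0.9, 0.1, 0.3, 0.2],     # "god"
    "devata": [0.85, 0.15, 0.25, 0.3],  # "deity" (related to "deva")
    "vana": [0.1, 0.8, 0.7, 0.05],    # "forest" (unrelated)
}

def cosine(u, v):
    """Cosine similarity, the standard score in intrinsic word-similarity tests."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# A useful embedding space scores related pairs above unrelated ones.
sim_related = cosine(embeddings["deva"], embeddings["devata"])
sim_unrelated = cosine(embeddings["deva"], embeddings["vana"])
```

In a full intrinsic evaluation, such model scores would be correlated (e.g. via Spearman's rank correlation) with human similarity ratings over a benchmark word-pair list.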

