Deconstructing Word Embeddings

01/08/2019
by Koushik Varma Kalidindi, et al.

A review of word embedding models through a deconstructive approach reveals several shortcomings and inconsistencies: instability of the vector representations, distorted analogical reasoning, geometric incompatibility with linguistic features, and inconsistencies in the corpus data. This paper proposes a new theoretical embedding model, the Derridian Embedding. Contemporary embedding models are evaluated qualitatively in terms of how closely they approach the capabilities of a Derridian Embedding.
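For context on the "analogical reasoning" shortcoming mentioned above, the sketch below shows the standard vector-offset analogy test commonly applied to static word embeddings (e.g. king - man + woman ≈ queen). The vocabulary and vector values are illustrative placeholders, not embeddings from the paper or from any trained model.

```python
# Minimal sketch of the vector-offset analogy test used to probe static
# word embeddings ("man is to king as woman is to ?").
# The vocabulary and vectors are placeholder values for illustration only.
import numpy as np

# Toy embedding table: word -> dense vector (placeholder values).
embeddings = {
    "king":  np.array([0.80, 0.65, 0.10]),
    "queen": np.array([0.78, 0.68, 0.55]),
    "man":   np.array([0.75, 0.10, 0.05]),
    "woman": np.array([0.72, 0.12, 0.50]),
}

def cosine(u, v):
    """Cosine similarity between two vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

def analogy(a, b, c, table):
    """Return the word d maximizing cos(d, b - a + c), excluding a, b, c."""
    target = table[b] - table[a] + table[c]
    candidates = {w: vec for w, vec in table.items() if w not in (a, b, c)}
    return max(candidates, key=lambda w: cosine(candidates[w], target))

if __name__ == "__main__":
    # Expected output on this toy table: "queen"
    print(analogy("man", "king", "woman", embeddings))
```

Offset-based evaluations of this kind are the usual setting in which claims about analogical reasoning in embeddings are made and tested; note that the convention of excluding the three query words from the candidate set is itself part of what such evaluations quietly assume.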
