A Survey On Neural Word Embeddings

10/05/2021
by   Erhan Sezerer, et al.
Understanding human language has been a long-standing challenge on the way to intelligent machines. The study of meaning in natural language processing (NLP) relies on the distributional hypothesis, by which language elements acquire meaning from the words that co-occur with them in context. The revolutionary idea of distributed representation is close to the working of the human mind in that the meaning of a word is spread across several neurons, so a loss of activation only slightly affects memory retrieval. Neural word embeddings transformed the field of NLP by introducing substantial improvements across NLP tasks. In this survey, we provide a comprehensive literature review of neural word embeddings. We give theoretical foundations and describe existing work through the interplay between word embeddings and language modelling. We provide broad coverage of neural word embeddings, including early word embeddings, embeddings targeting specific semantic relations, sense embeddings, morpheme embeddings, and, finally, contextual representations. We conclude by describing benchmark datasets used to evaluate word embeddings, the relevant downstream tasks, and the performance gains attributable to word embeddings.
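The distributional hypothesis mentioned above can be illustrated with a minimal sketch: represent each word by its co-occurrence counts with context words and compare words by cosine similarity. The vocabulary, context dimensions, and counts below are invented toy values for illustration only, not data from the survey.

```python
from math import sqrt

# Toy co-occurrence vectors (hypothetical counts).
# Context dimensions: ("pet", "drive", "food", "road")
embeddings = {
    "cat": [8, 0, 5, 0],
    "dog": [7, 1, 6, 0],
    "car": [0, 9, 0, 7],
}

def cosine(u, v):
    """Cosine similarity between two count vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = sqrt(sum(a * a for a in u)) * sqrt(sum(b * b for b in v))
    return dot / norm

sim_cat_dog = cosine(embeddings["cat"], embeddings["dog"])
sim_cat_car = cosine(embeddings["cat"], embeddings["car"])
# Words sharing contexts end up close: "cat" is nearer to "dog" than to "car".
assert sim_cat_dog > sim_cat_car
```

Neural word embeddings replace such sparse count vectors with dense, learned vectors, but the underlying principle, that similarity of contexts implies similarity of meaning, is the same.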

