Estimator Vectors: OOV Word Embeddings based on Subword and Context Clue Estimates

10/18/2019
by Raj Patel, et al.

Semantic representations of words have been successfully extracted from unlabeled corpora using neural network models such as word2vec. These representations are generally high quality and computationally inexpensive to train, which makes them popular. However, such approaches generally fail to approximate out-of-vocabulary (OOV) words, a task humans perform quite easily using word roots and context clues. This paper proposes a neural network model that jointly learns high-quality word representations, subword representations, and context clue representations. Learning all three types of representations together enhances the learning of each, leading to enriched word vectors as well as strong estimates for OOV words, obtained by combining the corresponding context clue and subword embeddings. Our model, called Estimator Vectors (EV), learns strong word embeddings and is competitive with state-of-the-art methods for OOV estimation.
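To illustrate the core idea of combining subword and context clue information for an OOV word, here is a minimal sketch. Everything in it is an assumption for illustration: the lookup tables are toy stand-ins for learned embeddings, the character n-gram scheme is FastText-style, and simple averaging is used as the combination rule; the abstract does not specify how EV actually combines the two embedding types.

```python
import numpy as np

DIM = 50
rng = np.random.default_rng(0)

# Hypothetical lookup tables; in the real model these would be learned jointly.
subword_vecs = {}   # character n-gram -> vector
word_vecs = {}      # in-vocabulary word -> vector


def vec(table, key):
    # Toy stand-in: a fixed random vector per key, created on first use.
    if key not in table:
        table[key] = rng.standard_normal(DIM)
    return table[key]


def char_ngrams(word, n_min=3, n_max=5):
    # FastText-style character n-grams with boundary markers.
    padded = f"<{word}>"
    return [padded[i:i + n]
            for n in range(n_min, n_max + 1)
            for i in range(len(padded) - n + 1)]


def estimate_oov(word, context_words):
    """Estimate an OOV word vector from its subword embeddings and the
    embeddings of the words surrounding it (its context clues)."""
    sub = np.mean([vec(subword_vecs, g) for g in char_ngrams(word)], axis=0)
    ctx = np.mean([vec(word_vecs, w) for w in context_words], axis=0)
    # Averaging is one simple combination; a learned model may weight these.
    return (sub + ctx) / 2.0


est = estimate_oov("unfrobnicate", ["the", "tool", "can", "settings"])
print(est.shape)  # (50,)
```

The point of the sketch is only the shape of the computation: an OOV word never gets a vector of its own, but one can still be estimated from pieces (subwords) and surroundings (context) that the model has seen before.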


Related research

research · 06/15/2021
Deriving Word Vectors from Contextualized Language Models using Topic-Aware Mention Selection
One of the long-standing challenges in lexical semantics consists in lea...

research · 07/17/2019
Analysis of Word Embeddings using Fuzzy Clustering
In data dominated systems and applications, a concept of representing wo...

research · 06/15/2016
Siamese CBOW: Optimizing Word Embeddings for Sentence Representations
We present the Siamese Continuous Bag of Words (Siamese CBOW) model, a n...

research · 05/14/2018
A La Carte Embedding: Cheap but Effective Induction of Semantic Feature Vectors
Motivations like domain adaptation, transfer learning, and feature learn...

research · 02/26/2019
Context Vectors are Reflections of Word Vectors in Half the Dimensions
This paper takes a step towards theoretical analysis of the relationship...

research · 07/19/2019
An Unsupervised Character-Aware Neural Approach to Word and Context Representation Learning
In the last few years, neural networks have been intensively used to dev...

research · 05/30/2018
What the Vec? Towards Probabilistically Grounded Embeddings
Vector representation, or embedding, of words is commonly achieved with ...
