Implicitly Incorporating Morphological Information into Word Embedding

01/10/2017
by Yang Xu, et al.

In this paper, we propose three novel models that enhance word embedding by implicitly using morphological information. Experiments on word similarity and syntactic analogy show that the implicit models are superior to traditional explicit ones: our models outperform all state-of-the-art baselines and significantly improve performance on both tasks. Moreover, our performance on the smallest corpus is comparable to that of CBOW trained on a corpus five times larger. Parameter analysis indicates that the implicit models supplement semantic information during word embedding training.
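The abstract does not spell out the implicit models themselves, but the explicit baseline they are contrasted with is commonly realized fastText-style: a word vector is composed from vectors for the word's character n-grams, so shared morphemes directly share parameters. Below is a minimal sketch of that explicit composition; the embedding dimension, n-gram range, and random (untrained) n-gram vectors are illustrative assumptions, not the paper's method.

import numpy as np

DIM = 100             # embedding dimension (assumed for illustration)
NGRAM_RANGE = (3, 6)  # character n-gram lengths, as in fastText (assumed)

def char_ngrams(word, lo=NGRAM_RANGE[0], hi=NGRAM_RANGE[1]):
    """Enumerate character n-grams of a word, with boundary markers < and >."""
    w = f"<{word}>"
    return [w[i:i + n] for n in range(lo, hi + 1) for i in range(len(w) - n + 1)]

class ExplicitMorphEmbedding:
    """Explicit morphological use: word vector = mean of its n-gram vectors."""
    def __init__(self, dim=DIM, seed=0):
        self.dim = dim
        self.rng = np.random.default_rng(seed)
        self.ngram_vecs = {}  # n-gram -> vector; trained in practice, random here

    def _vec(self, gram):
        if gram not in self.ngram_vecs:
            self.ngram_vecs[gram] = self.rng.normal(scale=0.1, size=self.dim)
        return self.ngram_vecs[gram]

    def word_vector(self, word):
        grams = char_ngrams(word)
        return np.sum([self._vec(g) for g in grams], axis=0) / len(grams)

emb = ExplicitMorphEmbedding()
v1, v2 = emb.word_vector("running"), emb.word_vector("runner")
cos = v1 @ v2 / (np.linalg.norm(v1) * np.linalg.norm(v2))
print(f"cosine(running, runner) = {cos:.3f}")  # shared n-grams -> high similarity

Even with untrained vectors, morphologically related words score high cosine similarity purely through shared n-grams; the paper's claim is that feeding morphology in implicitly during training beats this kind of explicit composition.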


Related research

07/20/2015
How to Generate a Good Word Embedding?
We analyze three critical components of word embedding training: the mod...

04/01/2019
Syntactic Interchangeability in Word Embedding Models
Nearest neighbors in word embedding models are commonly observed to be s...

09/02/2020
On SkipGram Word Embedding Models with Negative Sampling: Unified Framework and Impact of Noise Distributions
SkipGram word embedding models with negative sampling, or SGN in short, ...

03/11/2021
Evaluation of Morphological Embeddings for English and Russian Languages
This paper evaluates morphology-based embeddings for English and Russian...

08/01/2021
Realised Volatility Forecasting: Machine Learning via Financial Word Embedding
We develop FinText, a novel, state-of-the-art, financial word embedding ...

12/19/2022
Norm of word embedding encodes information gain
Distributed representations of words encode lexical semantic information...

01/19/2018
Size vs. Structure in Training Corpora for Word Embedding Models: Araneum Russicum Maximum and Russian National Corpus
In this paper, we present a distributional word embedding model trained ...
