Attention Word Embedding

06/01/2020
by Shashank Sonkar, et al.

Word embedding models learn semantically rich vector representations of words and are widely used to initialize natural language processing (NLP) models. The popular continuous bag-of-words (CBOW) model of word2vec learns a vector embedding by masking a given word in a sentence and then using the other words as context to predict it. A limitation of CBOW is that it weights all context words equally when making a prediction, which is inefficient, since some words carry more predictive value than others. We tackle this inefficiency by introducing the Attention Word Embedding (AWE) model, which integrates the attention mechanism into the CBOW model. We also propose AWE-S, which incorporates subword information. We demonstrate that AWE and AWE-S outperform state-of-the-art word embedding models both on a variety of word similarity datasets and when used to initialize NLP models.
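To make the contrast concrete, here is a minimal NumPy sketch of the difference between CBOW's uniform context averaging and an attention-weighted aggregation in the spirit of AWE. The key matrix `K`, query vector `q`, and dot-product scoring used below are illustrative assumptions for this sketch, not the paper's exact parameterization, and the training objective (e.g., negative sampling) and AWE-S subword handling are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

V, d = 10_000, 300                            # vocabulary size, embedding dimension
E_in = rng.normal(scale=0.1, size=(V, d))     # context (input) embeddings
E_out = rng.normal(scale=0.1, size=(V, d))    # output embeddings
K = rng.normal(scale=0.1, size=(V, d))        # per-word key vectors (hypothetical attention params)
q = rng.normal(scale=0.1, size=d)             # query vector (hypothetical attention param)

def softmax(x):
    x = x - x.max()                           # shift for numerical stability
    e = np.exp(x)
    return e / e.sum()

def cbow_context(context_ids):
    # Plain CBOW: every context word contributes with equal weight.
    return E_in[context_ids].mean(axis=0)

def awe_context(context_ids):
    # Attention-weighted context: words whose keys score higher
    # against the query contribute more to the prediction.
    scores = K[context_ids] @ q               # one scalar score per context word
    weights = softmax(scores)                 # normalized attention weights
    return weights @ E_in[context_ids]        # weighted sum of context embeddings

def predict_scores(h):
    # Score every vocabulary word as the masked word; a softmax over
    # these scores gives the CBOW-style prediction distribution.
    return E_out @ h

context = np.array([17, 42, 256, 1023])      # ids of the surrounding words
h_cbow = cbow_context(context)
h_awe = awe_context(context)
print(predict_scores(h_cbow).shape, predict_scores(h_awe).shape)  # (10000,) (10000,)
```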

