Using Dynamic Embeddings to Improve Static Embeddings

11/07/2019
by   Yile Wang, et al.

How to build high-quality word embeddings is a fundamental research question in natural language processing. Traditional methods such as Skip-Gram and Continuous Bag-of-Words learn static embeddings by training lookup tables that map words to dense vectors. Static embeddings are directly useful for lexical semantics tasks and can serve as input representations for downstream problems. Recently, contextualized embeddings such as BERT have been shown to be more effective than static embeddings as NLP input representations. Such embeddings are dynamic: they are computed from a sentential context using a neural network. One limitation of dynamic embeddings, however, is that they cannot be used without a sentence-level context. We explore the advantages of dynamic embeddings for training static embeddings by using contextualized embeddings to facilitate the training of static embedding lookup tables. Results show that the resulting embeddings outperform existing static embedding methods on various lexical semantics tasks.
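The core idea, distilling dynamic (context-dependent) vectors into a single static vector per word, can be illustrated with a minimal sketch. This is not the paper's exact training procedure; it simply averages each word's contextual vectors over all of its occurrences in a corpus, with a hypothetical `contextual_encoder` standing in for a model such as BERT.

```python
import numpy as np

DIM = 8
rng = np.random.default_rng(0)
_projection = rng.standard_normal((DIM, DIM))

def contextual_encoder(sentence, position):
    """Hypothetical stand-in for a contextual model like BERT: returns a
    context-dependent vector for the token at `position` in `sentence`."""
    token_seed = abs(hash(sentence[position])) % (2**32)
    base = np.random.default_rng(token_seed).standard_normal(DIM)
    context = np.full(DIM, len(sentence) / 10.0)  # crude context signal
    return _projection @ (base + context)

def build_static_table(corpus):
    """Build a static lookup table by averaging each word's contextual
    vectors across every occurrence in the corpus."""
    sums, counts = {}, {}
    for sentence in corpus:
        for i, word in enumerate(sentence):
            vec = contextual_encoder(sentence, i)
            sums[word] = sums.get(word, np.zeros(DIM)) + vec
            counts[word] = counts.get(word, 0) + 1
    return {w: sums[w] / counts[w] for w in sums}

corpus = [
    ["the", "bank", "approved", "the", "loan"],
    ["we", "sat", "on", "the", "river", "bank"],
]
table = build_static_table(corpus)
print(table["bank"].shape)  # one fixed vector per word, usable without context
```

The resulting table behaves like an ordinary static embedding lookup, so it can be queried without any sentence-level context, which is exactly the property static embeddings retain over dynamic ones.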


Related research

10/05/2021  Learning Sense-Specific Static Embeddings using Contextualised Word Embeddings as a Proxy
  Contextualised word embeddings generated from Neural Language Models (NL...

12/22/2020  Improved Biomedical Word Embeddings in the Transformer Era
  Biomedical word embeddings are usually pre-trained on free text corpora ...

04/06/2023  Static Fuzzy Bag-of-Words: a lightweight sentence embedding algorithm
  The introduction of embedding techniques has pushed forward significantl...

03/30/2021  Representing ELMo embeddings as two-dimensional text online
  We describe a new addition to the WebVectors toolkit which is used to se...

05/07/2022  Odor Descriptor Understanding through Prompting
  Embeddings from contemporary natural language processing (NLP) models ar...

08/08/2021  Efficacy of BERT embeddings on predicting disaster from Twitter data
  Social media like Twitter provide a common platform to share and communi...

01/18/2021  Alignment and stability of embeddings: measurement and inference improvement
  Representation learning (RL) methods learn objects' latent embeddings wh...
