Think Globally, Embed Locally --- Locally Linear Meta-embedding of Words

09/19/2017
by   Danushka Bollegala, et al.

Distributed word embeddings have shown superior performance in numerous Natural Language Processing (NLP) tasks. However, their performance varies significantly across different tasks, implying that the word embeddings learnt by those methods capture complementary aspects of lexical semantics. Therefore, we believe that it is important to combine the existing word embeddings to produce more accurate and complete meta-embeddings of words. For this purpose, we propose an unsupervised locally linear meta-embedding learning method that takes pre-trained word embeddings as the input, and produces more accurate meta-embeddings. Unlike previously proposed meta-embedding learning methods that learn a global projection over all words in a vocabulary, our proposed method is sensitive to the differences in local neighbourhoods of the individual source word embeddings. Moreover, we show that vector concatenation, a previously proposed highly competitive baseline approach for integrating word embeddings, can be derived as a special case of the proposed method. Experimental results on semantic similarity, word analogy, relation classification, and short-text classification tasks show that our meta-embeddings significantly outperform prior methods on several benchmark datasets, establishing a new state of the art for meta-embeddings.
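The locally linear idea described in the abstract can be sketched in two steps: reconstruct each word from its nearest neighbours in every source embedding space, then solve for meta-embeddings that preserve those reconstruction weights. The following is a minimal illustrative sketch (in the style of classic Locally Linear Embedding), not the authors' implementation; the vocabulary size, dimensionalities, regularisation constant, and random source embeddings are all assumptions for demonstration.

```python
# Hedged sketch: locally linear meta-embedding in the spirit of the paper.
# Random matrices stand in for real pre-trained source embeddings.
import numpy as np

rng = np.random.default_rng(0)
n_words, k, meta_dim = 20, 3, 5
# Two hypothetical pre-trained source embeddings over the same vocabulary.
sources = [rng.normal(size=(n_words, 50)), rng.normal(size=(n_words, 100))]

# Step 1: learn neighbourhood reconstruction weights per source.
W = np.zeros((n_words, n_words))
for X in sources:
    for i in range(n_words):
        dists = np.linalg.norm(X - X[i], axis=1)
        nbrs = np.argsort(dists)[1:k + 1]       # k nearest neighbours, excluding i
        Z = X[nbrs] - X[i]                      # centre neighbours on word i
        G = Z @ Z.T + 1e-3 * np.eye(k)          # regularised local Gram matrix
        w = np.linalg.solve(G, np.ones(k))
        W[i, nbrs] += w / w.sum()               # reconstruction weights sum to one
W /= len(sources)                               # average weights across sources

# Step 2: find meta-embeddings that the same weights reconstruct well
# (bottom eigenvectors of (I - W)^T (I - W), as in LLE).
M = (np.eye(n_words) - W).T @ (np.eye(n_words) - W)
eigvals, eigvecs = np.linalg.eigh(M)
meta = eigvecs[:, 1:meta_dim + 1]               # drop the constant eigenvector
print(meta.shape)
```

This two-step structure is what makes the method sensitive to local neighbourhoods: the weights are learnt per word and per source, rather than via one global projection.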


Related research

04/26/2022 · Learning Meta Word Embeddings by Unsupervised Weighted Concatenation of Source Embeddings
Given multiple source word embeddings learnt using diverse algorithms an...

08/18/2015 · Learning Meta-Embeddings by Using Ensembles of Embedding Sets
Word embeddings -- distributed representations of words -- in deep learn...

12/01/2020 · Meta-Embeddings for Natural Language Inference and Semantic Similarity tasks
Word Representations form the core component for almost all advanced Nat...

09/04/2017 · Learning Neural Word Salience Scores
Measuring the salience of a word is an essential step in numerous NLP ta...

04/16/2022 · Unsupervised Attention-based Sentence-Level Meta-Embeddings from Contextualised Language Models
A variety of contextualised language models have been proposed in the NL...

04/25/2022 · A Survey on Word Meta-Embedding Learning
Meta-embedding (ME) learning is an emerging approach that attempts to le...

12/27/2019 · Encoding word order in complex embeddings
Sequential word order is important when processing text. Currently, neur...
