Learning Efficient Task-Specific Meta-Embeddings with Word Prisms

11/05/2020, by Jingyi He, et al.

Word embeddings are trained to predict word co-occurrence statistics, which leads them to possess different lexical properties (syntactic, semantic, etc.) depending on the notion of context defined at training time. These properties manifest when querying the embedding space for the most similar vectors, and when the embeddings are used at the input layer of deep neural networks trained to solve downstream NLP problems. Meta-embeddings combine multiple sets of differently trained word embeddings, and have been shown to improve intrinsic and extrinsic performance over equivalent models that use just one set of source embeddings. We introduce word prisms: a simple and efficient meta-embedding method that learns to combine source embeddings according to the task at hand. Word prisms learn orthogonal transformations to linearly combine the input source embeddings, which allows them to be very efficient at inference time. We evaluate word prisms against other meta-embedding methods on six extrinsic evaluations and observe that word prisms improve performance on all tasks.
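The core mechanism the abstract describes — applying a learned orthogonal transformation to each source embedding and linearly combining the results — can be illustrated with a minimal NumPy sketch. This is not the paper's implementation: the function names are illustrative, the transforms here are randomly initialized rather than learned for a downstream task, and the combination is a plain average. It only shows why orthogonal maps are attractive: they preserve vector norms and angles within each source space while rotating the sources into a shared space.

```python
import numpy as np

def random_orthogonal(dim, rng):
    """Sample an orthogonal matrix via QR decomposition of a Gaussian matrix.
    (In the paper, such transforms would be learned; this is a stand-in.)"""
    q, r = np.linalg.qr(rng.standard_normal((dim, dim)))
    # Flip column signs so the distribution is uniform over orthogonal matrices.
    return q * np.sign(np.diag(r))

def word_prism(source_vectors, transforms):
    """Combine one word's embeddings from several sources:
    rotate each source vector with its orthogonal map, then average."""
    mapped = [W @ v for W, v in zip(transforms, source_vectors)]
    return np.mean(mapped, axis=0)

rng = np.random.default_rng(0)
dim = 4
# Three hypothetical source embeddings for the same word
# (e.g., from embeddings trained with different notions of context).
sources = [rng.standard_normal(dim) for _ in range(3)]
transforms = [random_orthogonal(dim, rng) for _ in range(3)]
meta = word_prism(sources, transforms)
```

Because each transform is orthogonal (W.T @ W = I), mapping a source embedding never distorts its internal geometry — nearest-neighbor structure within each source space is preserved — and the combination reduces to cheap matrix-vector products at inference time.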
