Characterizing the impact of geometric properties of word embeddings on task performance

04/09/2019

Analysis of word embedding properties to inform their use in downstream NLP tasks has largely been studied by assessing nearest neighbors. However, geometric properties of the continuous feature space contribute directly to the use of embedding features in downstream models, and are largely unexplored. We consider four properties of word embedding geometry, namely: position relative to the origin, distribution of features in the vector space, global pairwise distances, and local pairwise distances. We define a sequence of transformations to generate new embeddings that expose subsets of these properties to downstream models and evaluate change in task performance to understand the contribution of each property to NLP models. We transform publicly available pretrained embeddings from three popular toolkits (word2vec, GloVe, and FastText) and evaluate on a variety of intrinsic tasks, which model linguistic information in the vector space, and extrinsic tasks, which use vectors as input to machine learning models. We find that intrinsic evaluations are highly sensitive to absolute position, while extrinsic tasks rely primarily on local similarity. Our findings suggest that future embedding models and post-processing techniques should focus primarily on similarity to nearby points in vector space.
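The paper's exact transformation sequence is not reproduced here, but the idea that different transformations expose different geometric properties can be sketched in a few lines. The snippet below (a minimal illustration using NumPy and a hypothetical toy embedding matrix, not the authors' code) shows that translating the point cloud changes absolute position relative to the origin while leaving all pairwise distances intact, whereas uniform scaling changes global pairwise distances but preserves local neighborhood structure:

```python
import numpy as np

rng = np.random.default_rng(0)
emb = rng.normal(size=(100, 50))  # toy embeddings: 100 "words", 50 dimensions

def pairwise_dists(m):
    # Euclidean distance between every pair of rows.
    diff = m[:, None, :] - m[None, :, :]
    return np.sqrt((diff ** 2).sum(-1))

# Translation: moves the cloud relative to the origin but leaves
# every pairwise distance (global and local) unchanged.
centered = emb - emb.mean(axis=0, keepdims=True)
assert np.allclose(pairwise_dists(emb), pairwise_dists(centered))

# Uniform scaling: multiplies all global pairwise distances by a
# constant, but nearest-neighbor (local similarity) rankings survive.
scaled = 2.0 * emb
assert np.allclose(pairwise_dists(scaled), 2.0 * pairwise_dists(emb))
```

Transformations like these let one hand a downstream model embeddings that differ in only one geometric property at a time, isolating that property's contribution to task performance.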


