Tailoring Word Embeddings for Bilexical Predictions: An Experimental Comparison

We investigate the problem of inducing word embeddings that are tailored to a particular bilexical relation. Our learning algorithm takes an existing lexical vector space and compresses it so that the resulting word embeddings are good predictors for the target bilexical relation. Experiments show that such task-specific embeddings can improve both accuracy and efficiency in lexical prediction tasks.
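The recipe in the abstract (start from a pretrained space, learn a relation-specific scorer, then compress it into small task-specific embeddings) can be sketched with a low-rank bilinear model. Everything below is an illustrative assumption rather than the paper's actual algorithm: the toy data, the dimensions, the plain least-squares fit, and the SVD-based compression are stand-ins for whatever training and compression scheme the authors use.

```python
import numpy as np

rng = np.random.default_rng(0)
n_words, d, k = 40, 25, 5          # vocab size, pretrained dim, compressed dim

# Toy "pretrained" word embeddings (assumption: random stand-ins).
E = rng.normal(size=(n_words, d))

# Toy supervision for one bilexical relation: distinct scored
# head/modifier pairs (targets are random stand-ins for gold scores).
pair_ids = rng.choice(n_words * n_words, size=100, replace=False)
heads, mods = pair_ids // n_words, pair_ids % n_words
targets = rng.normal(size=100)

# Step 1: fit a full bilinear scorer score(x, y) = x^T W y.
# Each training example becomes the flattened outer product of the
# head and modifier vectors, so W is found by linear least squares.
X = np.stack([np.outer(E[h], E[m]).ravel() for h, m in zip(heads, mods)])
w, *_ = np.linalg.lstsq(X, targets, rcond=None)
W = w.reshape(d, d)

# Step 2: compress W to rank k via SVD, yielding two k x d projections
# that map the original space into a small task-specific space.
u, s, vt = np.linalg.svd(W)
P = (u[:, :k] * np.sqrt(s[:k])).T      # projects head words to k dims
Q = np.sqrt(s[:k])[:, None] * vt[:k]   # projects modifier words to k dims

def score(x, y):
    """Bilinear score computed entirely in the compressed k-dim space."""
    return (P @ x) @ (Q @ y)
```

After compression, each word only needs its two k-dimensional projections `P @ E[i]` and `Q @ E[i]`, which is where the efficiency gain mentioned in the abstract would come from.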


Related research

- AutoExtend: Extending Word Embeddings to Embeddings for Synsets and Lexemes (07/04/2015)
- Task-Oriented Learning of Word Embeddings for Semantic Relation Classification (02/28/2015)
- Reconstruction of Word Embeddings from Sub-Word Parameters (07/21/2017)
- Take and Took, Gaggle and Goose, Book and Read: Evaluating the Utility of Vector Differences for Lexical Relation Learning (09/05/2015)
- Scoring Lexical Entailment with a Supervised Directional Similarity Network (05/23/2018)
- Query2Prod2Vec Grounded Word Embeddings for eCommerce (04/02/2021)
- Key Phrase Extraction Applause Prediction (01/01/2021)
