
Task-Oriented Learning of Word Embeddings for Semantic Relation Classification

02/28/2015
by Kazuma Hashimoto, et al. (The University of Tokyo; toyota-ti.ac.jp)

We present a novel learning method for word embeddings designed for relation classification. Our word embeddings are trained by predicting words between noun pairs using lexical relation-specific features on a large unlabeled corpus. This allows us to explicitly incorporate relation-specific information into the word embeddings. The learned word embeddings are then used to construct feature vectors for a relation classification model. On a well-established semantic relation classification task, our method significantly outperforms a baseline based on a previously introduced word embedding method, and compares favorably to previous state-of-the-art models that use syntactic information or manually constructed external resources.
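The core idea in the abstract, training word embeddings by predicting the words that appear between a noun pair, can be illustrated with a toy sketch. The snippet below is not the paper's implementation: the corpus, vocabulary, dimensionality, and plain softmax objective are illustrative assumptions, and the paper's lexical relation-specific features are omitted. It only shows the general shape of the approach: learn noun embeddings by predicting in-between words on (unlabeled) text, then concatenate the two noun embeddings as a feature vector for a relation classifier.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy corpus of (noun1, words between the pair, noun2) triples, standing in
# for contexts mined from a large unlabeled corpus (hypothetical examples).
triples = [
    ("virus", ["causes"], "infection"),
    ("knife", ["is", "used", "for", "cutting"], "bread"),
    ("flour", ["is", "stored", "in"], "jar"),
]

vocab = sorted({w for n1, mid, n2 in triples for w in [n1, n2] + mid})
idx = {w: i for i, w in enumerate(vocab)}
dim = 8
E = rng.normal(scale=0.1, size=(len(vocab), dim))      # word embeddings
W = rng.normal(scale=0.1, size=(len(vocab), 2 * dim))  # softmax weights

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

lr = 0.1
for epoch in range(200):
    for n1, mid, n2 in triples:
        # Represent the noun pair by concatenating its two embeddings,
        # and predict each word that occurs between the nouns.
        pair = np.concatenate([E[idx[n1]], E[idx[n2]]])
        for w in mid:
            p = softmax(W @ pair)
            grad = p.copy()
            grad[idx[w]] -= 1.0            # d(cross-entropy)/d(logits)
            gpair = W.T @ grad             # gradient w.r.t. the pair vector
            W -= lr * np.outer(grad, pair)
            E[idx[n1]] -= lr * gpair[:dim]
            E[idx[n2]] -= lr * gpair[dim:]

# A relation-classification feature vector for a new noun pair is then
# built from the learned embeddings, e.g. by concatenation.
feat = np.concatenate([E[idx["virus"]], E[idx["infection"]]])
print(feat.shape)  # (16,)
```

Because each noun pair is optimized to predict its connecting words, nouns that occur in similar relational contexts end up with similar embeddings, which is what makes the concatenated vectors informative features for a downstream relation classifier.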

