
Task-Oriented Learning of Word Embeddings for Semantic Relation Classification

by Kazuma Hashimoto et al., The University of Tokyo

We present a novel learning method for word embeddings designed for relation classification. Our word embeddings are trained by predicting words between noun pairs using lexical relation-specific features on a large unlabeled corpus. This allows us to explicitly incorporate relation-specific information into the word embeddings. The learned word embeddings are then used to construct feature vectors for a relation classification model. On a well-established semantic relation classification task, our method significantly outperforms a baseline based on a previously introduced word embedding method, and compares favorably to previous state-of-the-art models that use syntactic information or manually constructed external resources.


