Incorporating Both Distributional and Relational Semantics in Word Representations

12/18/2014
by Daniel Fried, et al.

We investigate the hypothesis that word representations ought to incorporate both distributional and relational semantics. To this end, we employ the Alternating Direction Method of Multipliers (ADMM), which flexibly optimizes a distributional objective on raw text and a relational objective on WordNet. Preliminary results on knowledge base completion, analogy tests, and parsing show that word representations trained on both objectives yield improvements in some cases.
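To make the alternating scheme concrete, here is a minimal runnable sketch of ADMM-style joint training of two embedding sets tied by a consensus constraint. It is not the paper's implementation: the two gradient functions below are hypothetical quadratic stand-ins for the actual losses (a distributional objective on raw text and a relational objective on WordNet), and the names, dimensions, rho, and lr are illustrative assumptions.

```python
import numpy as np

# ADMM sketch for the consensus problem
#   min_{W,V} f(W) + g(V)  subject to  W = V,
# via the augmented Lagrangian
#   L(W, V, Y) = f(W) + g(V) + <Y, W - V> + (rho/2) * ||W - V||^2.

rng = np.random.default_rng(0)
vocab, dim = 100, 50   # toy vocabulary size and embedding dimension
rho, lr = 0.05, 0.1    # penalty weight and gradient step size (assumptions)

W = rng.normal(scale=0.1, size=(vocab, dim))  # "distributional" embeddings
V = rng.normal(scale=0.1, size=(vocab, dim))  # "relational" embeddings
Y = np.zeros((vocab, dim))                    # dual variables for W = V

def grad_dist(W):
    # Hypothetical stand-in for the gradient of the distributional loss
    # (in the paper this would come from a model trained on raw text).
    return W - 1.0

def grad_rel(V):
    # Hypothetical stand-in for the gradient of the relational loss
    # (in the paper this would come from a WordNet-based objective).
    return V + 1.0

for step in range(200):
    # 1) Gradient step on the distributional loss plus coupling terms.
    W -= lr * (grad_dist(W) + Y + rho * (W - V))
    # 2) Gradient step on the relational loss plus coupling terms.
    V -= lr * (grad_rel(V) - Y - rho * (W - V))
    # 3) Dual ascent: accumulate the residual of the constraint W = V.
    Y += rho * (W - V)
```

The dual variables gradually pull the two embedding sets toward agreement while each still fits its own objective, which is what lets the method trade off distributional and relational signals rather than hard-sharing one set of parameters.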


Related research:

07/24/2019 · Distributional Analysis of Function Words
This paper is a first attempt at reconciling the current methods of dist...

09/14/2017 · KBLRN: End-to-End Learning of Knowledge Base Representations with Latent, Relational, and Numerical Features
We present KBLRN, a novel framework for end-to-end learning of knowledge...

05/06/2019 · Distributional Semantics and Linguistic Theory
Distributional semantics provides multi-dimensional, graded, empirically...

05/18/2016 · Leveraging Lexical Resources for Learning Entity Embeddings in Multi-Relational Data
Recent work in learning vector-space embeddings for multi-relational dat...

11/14/2014 · Learning Multi-Relational Semantics Using Neural-Embedding Models
In this paper we present a unified framework for modeling multi-relation...

10/06/2021 · Relation Prediction as an Auxiliary Training Objective for Improving Multi-Relational Graph Representations
Learning good representations on multi-relational graphs is essential to...

01/23/2020 · Learning Distributional Programs for Relational Autocompletion
Relational autocompletion is the problem of automatically filling out so...
