Global Textual Relation Embedding for Relational Understanding

06/03/2019
by Zhiyu Chen, et al.

Pre-trained embeddings, such as word embeddings and sentence embeddings, are fundamental tools that facilitate a wide range of downstream NLP tasks. In this work, we investigate how to learn a general-purpose embedding of textual relations, defined as the shortest dependency path between entities. Textual relation embedding provides a level of knowledge between the word/phrase level and the sentence level, and we show that it can facilitate downstream tasks requiring relational understanding of text. To learn such an embedding, we create the largest distant supervision dataset to date by linking the entire English ClueWeb09 corpus to Freebase. We use global co-occurrence statistics between textual and knowledge base relations as the supervision signal to train the embedding. Evaluation on two relational understanding tasks demonstrates the usefulness of the learned textual relation embedding. The data and code can be found at https://github.com/czyssrs/GloREPlus.
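The textual relation here is the shortest path between two entity mentions in a sentence's dependency parse. As a minimal illustrative sketch (not the authors' pipeline), the example below runs a breadth-first search over hand-written dependency edges standing in for a real parser's output; the toy sentence and edge list are assumptions for self-containedness.

```python
from collections import deque

def shortest_dependency_path(edges, source, target):
    """BFS over an undirected dependency graph to find the
    shortest path of tokens linking two entity mentions."""
    graph = {}
    for head, dep in edges:
        graph.setdefault(head, set()).add(dep)
        graph.setdefault(dep, set()).add(head)
    queue = deque([[source]])
    visited = {source}
    while queue:
        path = queue.popleft()
        node = path[-1]
        if node == target:
            return path
        for nxt in graph.get(node, ()):
            if nxt not in visited:
                visited.add(nxt)
                queue.append(path + [nxt])
    return None  # entities not connected in the parse

# Toy (head, dependent) edges for "Obama was born in Hawaii",
# as a hypothetical dependency parse might produce.
edges = [("born", "Obama"), ("born", "was"),
         ("born", "in"), ("in", "Hawaii")]
print(shortest_dependency_path(edges, "Obama", "Hawaii"))
# -> ['Obama', 'born', 'in', 'Hawaii']
```

In practice the path tokens (and their dependency labels) between the two entity slots form the textual relation string that gets embedded; in a real system the edges would come from a dependency parser rather than being written by hand.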


