Improving Distant Supervision with Maxpooled Attention and Sentence-Level Supervision

10/30/2018
by Iz Beltagy, et al.

We propose an effective multitask learning setup for reducing distant supervision noise by leveraging sentence-level supervision. We show how sentence-level supervision can be used to improve the encoding of individual sentences, and to learn which input sentences are more likely to express the relationship between a pair of entities. We also introduce a novel neural architecture for collecting signals from multiple input sentences, which combines the benefits of attention and maxpooling. The proposed method increases AUC by 10% over state-of-the-art results on the FB-NYT dataset.
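The abstract does not spell out how attention and maxpooling are combined, so the following is a minimal sketch of one plausible reading: each sentence encoding in a bag is scaled by a per-sentence attention weight (which, in the multitask setup, sentence-level supervision could train directly), and the scaled encodings are then element-wise max-pooled into a single bag representation. All names here (`maxpooled_attention`, `attn_scores`) are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def maxpooled_attention(sentence_encodings: np.ndarray,
                        attn_scores: np.ndarray) -> np.ndarray:
    """Aggregate a bag of sentence encodings into one relation vector.

    sentence_encodings: (n_sentences, dim) matrix, one row per sentence.
    attn_scores: (n_sentences,) unnormalized relevance logits; in a
        multitask setup these could be trained against sentence-level labels.
    """
    # Sigmoid weights damp sentences that likely do not express the
    # relation, without forcing sentences to compete as in a softmax.
    weights = 1.0 / (1.0 + np.exp(-attn_scores))
    # Scale each encoding by its weight, then take an element-wise max
    # so the strongest remaining evidence for each feature is kept.
    weighted = sentence_encodings * weights[:, None]
    return weighted.max(axis=0)

# Toy bag: three sentences mentioning the same entity pair.
rng = np.random.default_rng(0)
encodings = rng.normal(size=(3, 8))
scores = np.array([2.0, -1.0, 0.5])  # sentence 0 scored most relevant
bag_vector = maxpooled_attention(encodings, scores)
print(bag_vector.shape)  # (8,)
```

Under this reading, maxpooling preserves the sharp per-feature selectivity that a weighted average would blur, while the attention weights keep noisy, distantly supervised sentences from dominating the pooled vector.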
