An Attentive Neural Architecture for Fine-grained Entity Type Classification

04/19/2016
by Sonse Shimaoka, et al.

In this work we propose a novel attention-based neural network model for the task of fine-grained entity type classification that, unlike previously proposed models, recursively composes representations of entity mention contexts. Our model achieves state-of-the-art performance with 74.94% loose micro F1-score on the well-established FIGER dataset, a relative improvement of 2.59%. We also investigate the behavior of the attention mechanism of our model and observe that it can learn contextual linguistic expressions that indicate the fine-grained category memberships of an entity.
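To make the described mechanism concrete, below is a minimal sketch in PyTorch of an attentive classifier over an entity mention's context: a bidirectional LSTM encodes the context tokens, a small feed-forward scorer assigns an attention weight to each token, and the attention-weighted context representation is concatenated with an averaged mention embedding before a multi-label type classifier. This is an illustrative reconstruction of the general idea in the abstract, not the authors' exact architecture; the class name, layer choices, and dimensions are assumptions.

# Minimal sketch of attention over an entity mention's context (assumed
# layer names and sizes; not the paper's exact model).
import torch
import torch.nn as nn
import torch.nn.functional as F

class AttentiveMentionTyper(nn.Module):
    def __init__(self, vocab_size, num_types, emb_dim=100, hidden_dim=100, attn_dim=50):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # Encode the context tokens surrounding the mention.
        self.encoder = nn.LSTM(emb_dim, hidden_dim, bidirectional=True, batch_first=True)
        # Two-layer scorer producing one scalar attention weight per context token.
        self.attn_score = nn.Sequential(
            nn.Linear(2 * hidden_dim, attn_dim),
            nn.Tanh(),
            nn.Linear(attn_dim, 1),
        )
        # Multi-label classifier over fine-grained types (one sigmoid per type).
        self.classifier = nn.Linear(2 * hidden_dim + emb_dim, num_types)

    def forward(self, context_ids, mention_ids):
        # context_ids: (batch, ctx_len); mention_ids: (batch, men_len)
        ctx, _ = self.encoder(self.embed(context_ids))                  # (batch, ctx_len, 2*hidden)
        weights = F.softmax(self.attn_score(ctx).squeeze(-1), dim=-1)   # (batch, ctx_len)
        ctx_repr = torch.bmm(weights.unsqueeze(1), ctx).squeeze(1)      # attention-weighted sum
        mention_repr = self.embed(mention_ids).mean(dim=1)              # averaged mention embedding
        logits = self.classifier(torch.cat([ctx_repr, mention_repr], dim=-1))
        return logits, weights

# Example usage with toy ids; multi-label targets would be trained with BCEWithLogitsLoss.
model = AttentiveMentionTyper(vocab_size=1000, num_types=112)
logits, attn = model(torch.randint(0, 1000, (2, 10)), torch.randint(0, 1000, (2, 3)))

Returning the attention weights alongside the logits allows the kind of inspection mentioned in the abstract: for a given prediction, one can look at which context words received the highest weights.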


Related research

06/04/2016 - Neural Architectures for Fine-grained Entity Type Classification
In this work, we investigate several neural network architectures for fi...

04/21/2018 - Fine-grained Entity Typing through Increased Discourse Context and Adaptive Classification Thresholds
Fine-grained entity typing is the task of assigning fine-grained semanti...

09/13/2021 - Fine-grained Entity Typing via Label Reasoning
Conventional entity typing approaches are based on independent classific...

09/20/2018 - Building Context-aware Clause Representations for Situation Entity Type Classification
Capabilities to categorize a clause based on the type of situation entit...

09/30/2019 - Weakly Supervised Attention Networks for Fine-Grained Opinion Mining and Public Health
In many review classification applications, a fine-grained analysis of t...

11/23/2018 - Fine Grained Classification of Personal Data Entities
Entity Type Classification can be defined as the task of assigning categ...

07/19/2018 - Attend and Rectify: a Gated Attention Mechanism for Fine-Grained Recovery
We propose a novel attention mechanism to enhance Convolutional Neural N...
