Improving Slot Filling Performance with Attentive Neural Networks on Dependency Structures

07/04/2017
by Lifu Huang, et al.

Slot Filling (SF) aims to extract the values of certain types of attributes (or slots, such as person:cities_of_residence) for a given entity from a large collection of source documents. In this paper we propose an effective DNN architecture for SF with two new strategies: (1) taking a regularized dependency graph, instead of a raw sentence, as input to the DNN, to compress the wide contexts between the query and the candidate filler; and (2) incorporating two attention mechanisms, a local attention learned from the query and candidate filler and a global attention learned from external knowledge bases, to guide the model to better select indicative contexts for determining the slot type. Experiments show that this framework outperforms the state of the art on both relation extraction (16% absolute F-score gain) and slot filling validation for each individual system (up to 8.5% absolute F-score gain).
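To make the local attention idea concrete, here is a minimal sketch of attending over the word embeddings along a dependency path, cued by the query-entity and candidate-filler embeddings. This is an illustration only: the additive cue vector and dot-product scoring are assumptions for the sketch, not necessarily the paper's exact formulation, and the function name `local_attention` is hypothetical.

```python
import math

def local_attention(path_embs, query_emb, filler_emb):
    """Compute attention weights over dependency-path word embeddings.

    path_embs:  list of embedding vectors, one per word on the dependency path
    query_emb:  embedding of the query entity
    filler_emb: embedding of the candidate filler
    Returns (weights, context): softmax weights over path words and the
    attention-weighted context vector.
    """
    # Additive combination of query and filler as the attention cue (assumption).
    cue = [q + f for q, f in zip(query_emb, filler_emb)]
    # Dot-product score between each path word and the cue.
    scores = [sum(w * c for w, c in zip(vec, cue)) for vec in path_embs]
    # Numerically stable softmax over the scores.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    weights = [e / total for e in exps]
    # Weighted sum of path embeddings gives the attended context vector.
    dim = len(path_embs[0])
    context = [sum(weights[i] * path_embs[i][d] for i in range(len(path_embs)))
               for d in range(dim)]
    return weights, context

# Toy example: three path words in a 2-dimensional embedding space.
path = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
w, ctx = local_attention(path, [1.0, 0.0], [0.0, 0.0])
```

The global attention described in the abstract would be computed analogously, but with the cue vector derived from external knowledge-base features rather than from the query and filler embeddings.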

Related research:

- 03/16/2016: Comparing Convolutional Neural Networks to Traditional Models for Slot Filling. "We address relation classification in the context of slot filling, the t..."
- 10/01/2019: Type-aware Convolutional Neural Networks for Slot Filling. "The slot filling task aims at extracting answers for queries about entit..."
- 10/19/2022: Explainable Slot Type Attentions to Improve Joint Intent Detection and Slot Filling. "Joint intent detection and slot filling is a key research topic in natur..."
- 06/03/2021: GL-GIN: Fast and Accurate Non-Autoregressive Model for Joint Multiple Intent Detection and Slot Filling. "Multi-intent SLU can handle multiple intents in an utterance, which has ..."
- 10/26/2017: Impact of Coreference Resolution on Slot Filling. "In this paper, we demonstrate the importance of coreference resolution f..."
- 12/21/2020: Encoding Syntactic Knowledge in Transformer Encoder for Intent Detection and Slot Filling. "We propose a novel Transformer encoder-based architecture with syntactic..."
- 11/06/2018: CIS at TAC Cold Start 2015: Neural Networks and Coreference Resolution for Slot Filling. "This paper describes the CIS slot filling system for the TAC Cold Start ..."
