
Temporal Information Extraction for Question Answering Using Syntactic Dependencies in an LSTM-based Architecture

by Yuanliang Meng, et al.
UMass Lowell

In this paper, we propose a set of simple, architecturally uniform LSTM-based models to recover different kinds of temporal relations from text. Using the shortest dependency path between entities as input, the same architecture extracts intra-sentence, cross-sentence, and document-creation-time relations. A "double-checking" technique reverses entity pairs during classification, boosting the recall of positive cases and reducing misclassifications between opposite classes. An efficient pruning algorithm resolves conflicts globally. Evaluated on QA-TempEval (SemEval-2015 Task 5), our proposed technique outperforms state-of-the-art methods by a large margin.
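The "double-checking" idea from the abstract can be illustrated with a small sketch: classify a pair in both directions and reconcile the two predictions. The classifier below is a toy stand-in (not the paper's LSTM), and the relation labels and reconciliation rule are illustrative assumptions, not the authors' exact procedure.

```python
# Illustrative sketch of double-checking: classify (e1, e2) and the
# reversed pair (e2, e1), then reconcile the two predicted labels.
# The label set and tie-breaking rule here are assumptions.

INVERSE = {"BEFORE": "AFTER", "AFTER": "BEFORE",
           "INCLUDES": "IS_INCLUDED", "IS_INCLUDED": "INCLUDES",
           "SIMULTANEOUS": "SIMULTANEOUS", "NONE": "NONE"}

def double_check(classify, e1, e2):
    """Classify a pair in both directions and reconcile the labels.

    `classify` returns a (label, probability) tuple.
    """
    fwd_label, fwd_prob = classify(e1, e2)
    rev_label, rev_prob = classify(e2, e1)
    if INVERSE[rev_label] == fwd_label:
        return fwd_label                  # both directions agree
    # Disagreement: keep the more confident direction's prediction,
    # mapping the reversed label back to the forward orientation.
    return fwd_label if fwd_prob >= rev_prob else INVERSE[rev_label]

# Toy classifier: an event "happens before" another if its timestamp
# is smaller (purely for demonstration).
def toy_classify(a, b):
    if a == b:
        return ("SIMULTANEOUS", 1.0)
    return ("BEFORE", 0.9) if a < b else ("AFTER", 0.9)

print(double_check(toy_classify, 1, 2))   # BEFORE
print(double_check(toy_classify, 2, 1))   # AFTER
```

Checking that the reversed pair yields the inverse label is a cheap consistency test; when the two directions disagree, some reconciliation policy (here, higher confidence wins) is needed before the global pruning step resolves remaining conflicts.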



