Severing the Edge Between Before and After: Neural Architectures for Temporal Ordering of Events

04/08/2020
by   Miguel Ballesteros, et al.

In this paper, we propose a neural architecture and a set of training methods for ordering events by predicting temporal relations. Our proposed models receive a pair of events within a span of text as input and identify the temporal relation between them (Before, After, Equal, or Vague). Since a key challenge for this task is the scarcity of annotated data, our models rely on a combination of pretrained representations (i.e. RoBERTa, BERT, or ELMo), transfer and multi-task learning (leveraging complementary datasets), and self-training techniques. Experiments on the MATRES dataset of English documents establish a new state of the art on this task.
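The pairwise setup the abstract describes can be sketched as a classifier that concatenates the contextual vectors of the two events and scores the four relation labels. The sketch below is a minimal, hypothetical illustration: the `embed` function is a stand-in for a pretrained encoder such as RoBERTa (here it just produces deterministic toy vectors), and the weights are untrained placeholders, not the authors' model.

```python
import math
import random

# The four temporal relation labels used in the paper's task formulation.
LABELS = ["BEFORE", "AFTER", "EQUAL", "VAGUE"]

DIM = 8  # toy embedding size; a real encoder (e.g. RoBERTa) would give 768-d vectors


def embed(event: str) -> list:
    """Hypothetical stand-in for a pretrained contextual encoder."""
    rng = random.Random(sum(ord(c) for c in event))  # deterministic toy vector
    return [rng.uniform(-1.0, 1.0) for _ in range(DIM)]


def classify(event1: str, event2: str, weights, bias) -> str:
    """Concatenate the two event vectors, apply a linear layer and softmax."""
    x = embed(event1) + embed(event2)  # pair representation, length 2 * DIM
    scores = [
        sum(w_i * x_i for w_i, x_i in zip(w, x)) + b
        for w, b in zip(weights, bias)
    ]
    shift = max(scores)  # numerically stable softmax
    exps = [math.exp(s - shift) for s in scores]
    probs = [e / sum(exps) for e in exps]
    return LABELS[probs.index(max(probs))]


# Untrained placeholder parameters; training would fit these to annotated pairs.
rng = random.Random(0)
W = [[rng.uniform(-0.1, 0.1) for _ in range(2 * DIM)] for _ in LABELS]
B = [0.0] * len(LABELS)

print(classify("announced", "resigned", W, B))  # one of BEFORE/AFTER/EQUAL/VAGUE
```

In the paper's setting the linear head would sit on top of fine-tuned transformer representations, and the scarcity of labels is what motivates the transfer, multi-task, and self-training strategies mentioned above.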


Related research

06/19/2019 · Embedding time expressions for deep temporal ordering models
Data-driven models have demonstrated state-of-the-art performance in inf...

09/18/2018 · Transfer and Multi-Task Learning for Noun-Noun Compound Interpretation
In this paper, we empirically evaluate the utility of transfer and multi...

04/14/2015 · Temporal ordering of clinical events
This report describes a minimalistic set of methods engineered to anchor...

05/15/2021 · STAGE: Tool for Automated Extraction of Semantic Time Cues to Enrich Neural Temporal Ordering Models
Despite achieving state-of-the-art accuracy on temporal ordering of even...

02/11/2022 · Learning Temporal Rules from Noisy Timeseries Data
Events across a timeline are a common data representation, seen in diffe...

09/16/2020 · Reasoning about Goals, Steps, and Temporal Ordering with WikiHow
We propose a suite of reasoning tasks on two types of relations between ...

07/17/2018 · Power Networks: A Novel Neural Architecture to Predict Power Relations
Can language analysis reveal the underlying social power relations that ...
