Exploring Contextualized Neural Language Models for Temporal Dependency Parsing

04/30/2020
by Hayley Ross, et al.

Extracting temporal relations between events and time expressions has many applications, such as constructing event timelines and time-related question answering. It is a challenging problem that requires syntactic and semantic information at the sentence or discourse level, which may be captured by deep contextualized language models such as BERT (Devlin et al., 2019). In this paper, we develop several variants of a BERT-based temporal dependency parser and show that BERT significantly improves temporal dependency parsing (Zhang and Xue, 2018a). Source code and trained models will be made available at github.com.
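To illustrate the general idea of combining BERT contextualized embeddings with a ranking-style temporal dependency parser, the sketch below scores candidate temporal parents for an event span. It is a minimal, illustrative example only, not the authors' exact architecture: it assumes the Hugging Face transformers library, a span-averaging encoder, and a small feed-forward scoring head, all of which are assumptions rather than details taken from the paper.

# Minimal sketch: BERT embeddings feeding a ranking-style scorer for
# temporal dependency edges (in the spirit of Zhang and Xue, 2018).
# Illustrative only; hyperparameters and span handling are assumptions.
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizerFast

class BertEdgeScorer(nn.Module):
    def __init__(self, bert_name="bert-base-uncased", hidden=256):
        super().__init__()
        self.bert = BertModel.from_pretrained(bert_name)
        dim = self.bert.config.hidden_size
        # Scores a (child event, candidate parent) pair of spans.
        self.scorer = nn.Sequential(
            nn.Linear(2 * dim, hidden),
            nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def span_embedding(self, token_states, span):
        # Average the BERT states of the word-piece tokens in the span.
        start, end = span
        return token_states[start:end].mean(dim=0)

    def forward(self, input_ids, attention_mask, child_span, parent_spans):
        states = self.bert(input_ids=input_ids,
                           attention_mask=attention_mask).last_hidden_state[0]
        child = self.span_embedding(states, child_span)
        scores = []
        for span in parent_spans:
            parent = self.span_embedding(states, span)
            scores.append(self.scorer(torch.cat([child, parent])))
        # Higher score = more likely temporal parent; one would train this
        # with, e.g., a cross-entropy loss over the candidate set.
        return torch.stack(scores).squeeze(-1)

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
model = BertEdgeScorer()
enc = tokenizer("She left before the meeting started.", return_tensors="pt")
# Hypothetical word-piece spans for the child event and two candidate parents.
scores = model(enc["input_ids"], enc["attention_mask"],
               child_span=(2, 3), parent_spans=[(5, 6), (6, 7)])
print(scores)

In this setup, the contextualized token states replace the static word embeddings of earlier neural ranking parsers, which is the kind of substitution the paper's BERT-based variants explore.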

Related research

09/02/2018 · Neural Ranking Models for Temporal Dependency Structure Parsing
We design and build the first neural temporal dependency parser. It util...

01/13/2019 · Passage Re-ranking with BERT
Recently, neural models pretrained on a language modeling task, such as ...

02/16/2023 · Syntactic Structure Processing in the Brain while Listening
Syntactic parsing is the task of assigning a syntactic structure to a se...

10/31/2019 · LIMIT-BERT: Linguistic Informed Multi-Task BERT
In this paper, we present a Linguistic Informed Multi-Task BERT (LIMIT-B...

10/15/2022 · A Simple and Strong Baseline for End-to-End Neural RST-style Discourse Parsing
To promote and further develop RST-style discourse parsing models, we ne...

07/31/2020 · Interactive Text Graph Mining with a Prolog-based Dialog Engine
On top of a neural network-based dependency parser and a graph-based nat...

06/08/2021 · A Modest Pareto Optimisation Analysis of Dependency Parsers in 2021
We evaluate three leading dependency parser systems from different parad...
