Coreferential Reasoning Learning for Language Representation

04/15/2020
by Deming Ye, et al.

Language representation models such as BERT can effectively capture contextual semantic information from plain text, and have been shown to achieve promising results on many downstream NLP tasks with appropriate fine-tuning. However, existing language representation models seldom consider coreference explicitly, i.e., the relationship between noun phrases that refer to the same entity, which is essential to a coherent understanding of the whole discourse. To address this issue, we present CorefBERT, a novel language representation model designed to capture the relations between noun phrases that co-refer to each other. Experimental results show that, compared with existing baselines, CorefBERT achieves significant improvements on several downstream NLP tasks that require coreferential reasoning, while maintaining comparable performance to previous models on other common NLP tasks.
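To make the target phenomenon concrete, the following minimal sketch (not from the paper) probes an off-the-shelf BERT model with a cloze that can only be filled sensibly by linking the masked mention back to an earlier noun phrase. It assumes the Hugging Face transformers library and the public bert-base-cased checkpoint; CorefBERT is trained to handle exactly this kind of coreferential prediction more reliably.

from transformers import pipeline

# Illustrative only: a fill-mask probe whose correct answer depends on
# resolving a coreference link, the kind of reasoning CorefBERT targets.
# Assumes the public bert-base-cased checkpoint; not the authors' code.
fill_mask = pipeline("fill-mask", model="bert-base-cased")

passage = (
    "Claire lent her laptop to Antoine because "
    "[MASK] had broken his own computer."
)

# A model with strong coreferential reasoning should prefer a filler that
# refers back to "Antoine" (e.g. "he"), not an unrelated word.
for candidate in fill_mask(passage, top_k=5):
    print(f"{candidate['token_str']:>10}  {candidate['score']:.3f}")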

