Revealing the Myth of Higher-Order Inference in Coreference Resolution

09/25/2020
by   Liyan Xu, et al.

This paper analyzes the impact of higher-order inference (HOI) on the task of coreference resolution. HOI has been adopted by almost all recent coreference resolution models without much investigation of its true effectiveness beyond representation learning. For a comprehensive analysis, we implement an end-to-end coreference system together with four HOI approaches: attended antecedents, entity equalization, span clustering, and cluster merging, the latter two being our original methods. We find that given a high-performing encoder such as SpanBERT, the impact of HOI is negative to marginal, offering a new perspective on HOI for this task. Our best model, using cluster merging, achieves an Avg-F1 of 80.2 on the English portion of the CoNLL 2012 shared task dataset.
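To make the attended-antecedent idea concrete, the sketch below shows one round of higher-order span refinement in the style popularized by Lee et al. (2018): each span representation is interpolated, via a learned gate, with the expected antecedent representation under the current antecedent distribution. The shapes, the gate parameterization, and the masking scheme here are illustrative assumptions for exposition, not the paper's exact implementation.

```python
import numpy as np

def softmax(x):
    # Numerically stable row-wise softmax.
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def attended_antecedent_refinement(spans, scores, W_gate):
    """One round of 'attended antecedent' higher-order refinement.

    spans  : (n, d) span embeddings, in document order
    scores : (n, n) pairwise scores; entry (i, j) scores span j as an
             antecedent of span i (only j < i is a valid antecedent)
    W_gate : (2d, d) projection for the update gate
    """
    n, d = spans.shape
    # Mask out invalid antecedents (j >= i). In a real system the first
    # span would point to a dummy antecedent; here the masked softmax
    # simply degenerates to uniform for span 0.
    invalid = np.triu(np.ones((n, n), dtype=bool))
    probs = softmax(np.where(invalid, -1e9, scores))   # (n, n)
    expected = probs @ spans                           # expected antecedent, (n, d)
    # Learned gate decides how much each span is updated.
    gate_in = np.concatenate([spans, expected], axis=1)
    gate = 1.0 / (1.0 + np.exp(-(gate_in @ W_gate)))   # sigmoid, (n, d)
    return gate * spans + (1.0 - gate) * expected
```

Iterating this update lets antecedent decisions condition on entity-level rather than purely span-level evidence, which is exactly the effect whose benefit the paper calls into question once a strong encoder is used.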


04/12/2017

Higher-order clustering in networks

A fundamental property of complex networks is the tendency for edges to ...
09/01/2021

Adapted End-to-End Coreference Resolution System for Anaphoric Identities in Dialogues

We present an effective system adapted from the end-to-end neural corefe...
05/13/2018

Neural Coreference Resolution with Deep Biaffine Attention by Joint Mention Detection and Mention Clustering

Coreference resolution aims to identify in a text all mentions that refe...
04/15/2018

Higher-order Coreference Resolution with Coarse-to-fine Inference

We introduce a fully differentiable approximation to higher-order infere...
07/04/2021

End-to-end Neural Coreference Resolution Revisited: A Simple yet Effective Baseline

Since the first end-to-end neural coreference resolution model was intro...
01/16/2014

Narrowing the Modeling Gap: A Cluster-Ranking Approach to Coreference Resolution

Traditional learning-based coreference resolvers operate by training the...
01/04/2021

Are Eliminated Spans Useless for Coreference Resolution? Not at all

Various neural-based methods have been proposed so far for joint mention...