Coreference Resolution without Span Representations

01/02/2021
by Yuval Kirstain et al.

Since the introduction of deep pretrained language models, most task-specific NLP models have been reduced to simple lightweight layers. An exception to this trend is the challenging task of coreference resolution, where a sophisticated end-to-end model is appended to a pretrained transformer encoder. While highly effective, this model has a very large memory footprint, primarily due to dynamically constructed span and span-pair representations, which hinders the processing of complete documents and the ability to train on multiple instances in a single batch. We introduce a lightweight coreference model that removes the dependency on span representations, handcrafted features, and heuristics. Our model performs competitively with the current end-to-end model, while being simpler and more efficient.
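To make the memory argument concrete, the sketch below contrasts the two approaches: rather than materializing a vector for every candidate span, a span (i, j) is scored from the representations of its start and end tokens alone, so all O(n^2) span scores fall out of a single bilinear product. This is a minimal numpy illustration, not the paper's exact architecture; the weight names (`w_start`, `w_end`, `B`) and dimensions are stand-ins for learned parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: n tokens, hidden dimension d (illustrative only).
n, d = 6, 8
tokens = rng.standard_normal((n, d))  # contextualized token vectors from the encoder

# Stand-ins for learned parameters of a boundary-token scorer.
w_start = rng.standard_normal(d)      # scores a token as a span start
w_end = rng.standard_normal(d)        # scores a token as a span end
B = rng.standard_normal((d, d))       # bilinear start-end interaction

def mention_score(i, j):
    """Score span [i, j] using only its boundary token vectors."""
    return tokens[i] @ w_start + tokens[j] @ w_end + tokens[i] @ B @ tokens[j]

# All O(n^2) span scores from n start scores, n end scores, and one
# n x n matrix product -- no per-span vectors are ever constructed.
start = tokens @ w_start                                        # shape (n,)
end = tokens @ w_end                                            # shape (n,)
scores = start[:, None] + end[None, :] + tokens @ B @ tokens.T  # shape (n, n)

# The vectorized table agrees with the per-span scorer.
assert np.isclose(scores[1, 3], mention_score(1, 3))
```

The memory saving comes from the last three lines: the intermediate tensors are O(n) vectors and one O(n^2) score matrix of scalars, instead of O(n^2) span vectors of size d (or larger, when span-pair vectors are concatenated).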


Related research:

- 06/06/2020, A Cross-Task Analysis of Text Span Representations: Many natural language processing (NLP) tasks involve reasoning with text...
- 11/29/2022, End-to-End Neural Discourse Deixis Resolution in Dialogue: We adapt Lee et al.'s (2018) span-based entity coreference model to the ...
- 09/20/2021, Improving Span Representation for Domain-adapted Coreference Resolution: Recent work has shown fine-tuning neural coreference models can produce ...
- 10/09/2022, Deep Span Representations for Named Entity Recognition: Span-based models are one of the most straightforward methods for named ...
- 05/18/2020, Span-ConveRT: Few-shot Span Extraction for Dialog with Pretrained Conversational Representations: We introduce Span-ConveRT, a light-weight model for dialog slot-filling ...
- 05/16/2020, ApplicaAI at SemEval-2020 Task 11: On RoBERTa-CRF, Span CLS and Whether Self-Training Helps Them: This paper presents the winning system for the propaganda Technique Clas...
- 09/09/2021, Word-Level Coreference Resolution: Recent coreference resolution models rely heavily on span representation...
