Coreference Resolution without Span Representations

01/02/2021 ∙ by Yuval Kirstain, et al.

Since the introduction of deep pretrained language models, most task-specific NLP models have been reduced to simple, lightweight layers. An exception to this trend is the challenging task of coreference resolution, where a sophisticated end-to-end model is appended to a pretrained transformer encoder. While highly effective, this model has a very large memory footprint, primarily due to dynamically constructed span and span-pair representations, which hinders the processing of complete documents and the ability to train on multiple instances in a single batch. We introduce a lightweight coreference model that removes the dependency on span representations, handcrafted features, and heuristics. Our model performs competitively with the current end-to-end model while being simpler and more efficient.
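
The central idea, scoring candidate spans directly from the contextualized representations of their start and end tokens instead of materializing a vector per span, can be illustrated with a short PyTorch sketch. Everything below (class name, projection sizes, the biaffine scoring form) is an assumption for illustration, not the authors' exact architecture:

```python
import torch
import torch.nn as nn

class StartEndMentionScorer(nn.Module):
    """Hypothetical sketch of a span-free mention scorer: every
    (start, end) token pair gets a score from two unary terms and one
    bilinear term, so no per-span embedding is ever constructed."""

    def __init__(self, hidden_dim: int, ffnn_dim: int = 1024):
        super().__init__()
        # Separate projections for a token acting as a span start vs. end.
        self.start_proj = nn.Linear(hidden_dim, ffnn_dim)
        self.end_proj = nn.Linear(hidden_dim, ffnn_dim)
        self.start_score = nn.Linear(ffnn_dim, 1)
        self.end_score = nn.Linear(ffnn_dim, 1)
        # Bilinear term coupling the start and end tokens of a span.
        self.biaffine = nn.Parameter(torch.zeros(ffnn_dim, ffnn_dim))

    def forward(self, tokens: torch.Tensor) -> torch.Tensor:
        # tokens: (seq_len, hidden_dim) contextualized encoder outputs.
        starts = torch.relu(self.start_proj(tokens))  # (seq, ffnn)
        ends = torch.relu(self.end_proj(tokens))      # (seq, ffnn)
        # Broadcast unary scores and add the start-end interaction;
        # the result is a (seq, seq) matrix of candidate-span scores.
        scores = (
            self.start_score(starts)                   # (seq, 1)
            + self.end_score(ends).transpose(0, 1)     # (1, seq)
            + starts @ self.biaffine @ ends.transpose(0, 1)
        )
        # Mask invalid spans whose end precedes their start.
        seq_len = tokens.size(0)
        valid = torch.triu(torch.ones(seq_len, seq_len, dtype=torch.bool))
        return scores.masked_fill(~valid, float("-inf"))
```

Under this factorization, memory is dominated by two (seq_len, ffnn_dim) projections and one (seq_len, seq_len) score matrix, rather than by explicit span and span-pair vectors, which is the efficiency gain the abstract refers to.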
