Training with Streaming Annotation

02/11/2020
by Tongtao Zhang, et al.

In this paper, we address a practical scenario where training data is released in a sequence of small-scale batches and annotation in earlier phases has lower quality than the later counterparts. To tackle the situation, we utilize a pre-trained transformer network to preserve and integrate the most salient document information from the earlier batches while focusing on the annotation (presumably with higher quality) from the current batch. Using event extraction as a case study, we demonstrate in the experiments that our proposed framework can perform better than conventional approaches (the improvement ranges from 3.6 to 14.9 absolute F-score gain), especially when there is more noise in the early annotation; and our approach spares 19.1 training time with regard to the best conventional method.
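The core idea above, training on a stream of annotation batches while a frozen pre-trained encoder preserves information learned before, can be illustrated with a toy sketch. This is a minimal illustration under stated assumptions, not the paper's model: the "pre-trained transformer" is stood in for by a fixed random feature encoder, the task is synthetic binary classification, and all names (`encode`, `train_on_batch`, noise rates) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Frozen "pre-trained encoder": a fixed random projection standing in for a
# pre-trained transformer (hypothetical stand-in, not the paper's network).
D_IN, D_HID = 16, 32
W_enc = rng.normal(size=(D_IN, D_HID))  # frozen; never updated during streaming

def encode(x):
    return np.tanh(x @ W_enc)

# Trainable classification head, updated as annotation batches stream in.
w = np.zeros(D_HID)
b = 0.0

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_on_batch(x, y, lr=0.1, epochs=50):
    """Gradient descent on the head only, using the current batch's labels;
    the frozen encoder carries over representations from earlier phases."""
    global w, b
    h = encode(x)
    for _ in range(epochs):
        p = sigmoid(h @ w + b)
        grad = p - y
        w -= lr * h.T @ grad / len(y)
        b -= lr * grad.mean()

# Simulated stream: earlier batches carry noisier labels than later ones,
# mirroring the scenario where annotation quality improves over time.
true_w = rng.normal(size=D_IN)
for noise in (0.3, 0.2, 0.0):  # label-flip rate decreases batch by batch
    x = rng.normal(size=(200, D_IN))
    y = (x @ true_w > 0).astype(float)
    flip = rng.random(len(y)) < noise
    y[flip] = 1 - y[flip]
    train_on_batch(x, y)

# Evaluate on clean held-out data.
x_test = rng.normal(size=(500, D_IN))
y_test = (x_test @ true_w > 0).astype(float)
acc = ((sigmoid(encode(x_test) @ w + b) > 0.5) == y_test).mean()
print(f"held-out accuracy: {acc:.2f}")
```

Because only the head is updated on each incoming batch, noisy early labels cannot corrupt the shared representation, while the most recent (cleanest) batch dominates the final decision boundary.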


