DiPietro-Hazari Kappa: A Novel Metric for Assessing Labeling Quality via Annotation

09/17/2022
by Daniel M. DiPietro, et al.

Data is a key component of modern machine learning, but statistics for assessing data label quality remain sparse in the literature. Here, we introduce DiPietro-Hazari Kappa, a novel statistical metric for assessing the quality of suggested dataset labels in the context of human annotation. Rooted in the classical Fleiss's Kappa measure of inter-annotator agreement, the DiPietro-Hazari Kappa quantifies the differential in empirical annotator agreement attained above what would be expected by random chance. We offer a thorough theoretical examination of Fleiss's Kappa before turning to our derivation of DiPietro-Hazari Kappa. Finally, we conclude with a matrix formulation and a set of procedural instructions for easy computational implementation.
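For context on the underlying agreement measure: Fleiss's Kappa compares the observed pairwise annotator agreement with the agreement expected under random chance, κ = (P̄ − P̄ₑ) / (1 − P̄ₑ). Below is a minimal NumPy sketch of that classical computation, not of the paper's own metric; the function name `fleiss_kappa` and the toy ratings matrix are illustrative assumptions, and the DiPietro-Hazari Kappa's matrix formulation is given only in the full text.

```python
import numpy as np

def fleiss_kappa(counts) -> float:
    """Fleiss's kappa for an (items x categories) matrix of rating counts.

    Each row records how many of the n annotators assigned that item to
    each category; every row must sum to the same n.
    """
    counts = np.asarray(counts, dtype=float)
    N, _ = counts.shape
    n = counts.sum(axis=1)[0]  # annotators per item (assumed constant)
    # Per-item agreement: fraction of annotator pairs that agree.
    P_i = (np.square(counts).sum(axis=1) - n) / (n * (n - 1))
    P_bar = P_i.mean()
    # Chance agreement from the marginal category proportions.
    p_j = counts.sum(axis=0) / (N * n)
    P_e = np.square(p_j).sum()
    return (P_bar - P_e) / (1 - P_e)

# Toy example: 4 items, 3 annotators, 2 categories.
ratings = [[3, 0], [2, 1], [1, 2], [3, 0]]
print(fleiss_kappa(ratings))  # ~0.111: slight agreement above chance
```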
