Cross-replication Reliability – An Empirical Approach to Interpreting Inter-rater Reliability

06/11/2021
by Ka Wong, et al.

We present a new approach to interpreting inter-rater reliability (IRR) that is empirical and contextualized. It is based on benchmarking IRR against baseline measures obtained in a replication, one of which is a novel cross-replication reliability (xRR) measure based on Cohen's kappa. We call this approach the xRR framework. We open-source a replication dataset of 4 million human judgements of facial expressions and analyze it with the proposed framework. We argue this framework can be used to measure the quality of crowdsourced datasets.
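The abstract names Cohen's kappa as the basis for xRR but does not spell out the estimator. The Python sketch below illustrates the general idea of a chance-corrected cross-replication agreement measure: a plain Cohen's kappa between two raters, and a kappa-style statistic computed over rater pairs drawn across two replications. The function names (cohens_kappa, cross_kappa), the complete per-item rating matrices, and the use of pooled label marginals for chance agreement are assumptions made for illustration; the paper's exact xRR definition may differ.

import numpy as np

def cohens_kappa(a, b):
    # Cohen's kappa between two raters' label vectors of equal length.
    a, b = np.asarray(a), np.asarray(b)
    labels = np.union1d(a, b)
    p_o = np.mean(a == b)                              # observed agreement
    p_a = np.array([np.mean(a == l) for l in labels])  # rater A's label marginals
    p_b = np.array([np.mean(b == l) for l in labels])  # rater B's label marginals
    p_e = float(np.sum(p_a * p_b))                     # agreement expected by chance
    return (p_o - p_e) / (1.0 - p_e)

def cross_kappa(rep_x, rep_y):
    # Kappa-style cross-replication agreement (illustrative sketch, not
    # the paper's published estimator). rep_x and rep_y hold labels for
    # the same items, shaped (n_items, n_raters); the rater pools may
    # differ between replications.
    rep_x, rep_y = np.asarray(rep_x), np.asarray(rep_y)
    labels = np.union1d(rep_x, rep_y)
    # Observed: per item, the fraction of agreeing (X-rater, Y-rater)
    # pairs, averaged over items.
    agree = [np.mean(rep_x[i][:, None] == rep_y[i][None, :])
             for i in range(len(rep_x))]
    p_o = float(np.mean(agree))
    # Expected: chance agreement from each replication's pooled label marginals.
    p_x = np.array([np.mean(rep_x == l) for l in labels])
    p_y = np.array([np.mean(rep_y == l) for l in labels])
    p_e = float(np.sum(p_x * p_y))
    return (p_o - p_e) / (1.0 - p_e)

# Toy usage: 3 items judged by 3 raters in each of two replications.
rep_x = [[1, 1, 0], [0, 0, 0], [1, 0, 1]]
rep_y = [[1, 1, 1], [0, 1, 0], [1, 1, 1]]
print(cross_kappa(rep_x, rep_y))

Read this way, a cross_kappa close to the within-replication kappa suggests the two rater pools are interchangeable, while a markedly lower value signals systematic differences between replications; this is the kind of empirical baseline the framework uses to contextualize IRR.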

Related research

11/26/2018
A New Standard for the Analysis and Design of Replication Studies
A new standard is proposed for the evidential assessment of replication ...

01/09/2020
Hybrid Coded Replication in LoRa Networks
Low Power Wide Area Networks (LPWAN) are wireless connectivity solutions...

05/19/2020
Identifying Statistical Bias in Dataset Replication
Dataset replication is a useful tool for assessing whether improvements ...

06/30/2022
Bio-inspired Machine Learning: programmed death and replication
We analyze algorithmic and computational aspects of biological phenomena...

01/15/2018
Conceptualizing and Evaluating Replication Across Domains of Behavioral Research
We discuss the authors' conceptualization of replication, in particular ...

09/13/2023
Mitigate Replication and Copying in Diffusion Models with Generalized Caption and Dual Fusion Enhancement
While diffusion models demonstrate a remarkable capability for generatin...

05/18/2018
A Self-Replication Basis for Designing Complex Agents
In this work, we describe a self-replication-based mechanism for designi...
