Wisdom for the Crowd: Discoursive Power in Annotation Instructions for Computer Vision

05/23/2021
by Milagros Miceli, et al.

Developers of computer vision algorithms outsource some of the labor involved in annotating training data through business process outsourcing companies and crowdsourcing platforms. Many data annotators are situated in the Global South and are considered independent contractors. This paper focuses on the experiences of Argentinian and Venezuelan annotation workers. Through qualitative methods, we explore the discourses encoded in the task instructions that these workers follow to annotate computer vision datasets. Our preliminary findings indicate that annotation instructions reflect worldviews imposed on workers and, through their labor, on datasets. Moreover, we observe that for-profit goals drive task instructions and that managers and algorithms ensure that annotations are carried out according to requesters' commands. This configuration constitutes a form of commodified labor that perpetuates power asymmetries, reinforces social inequalities, and reproduces them in datasets and, subsequently, in computer vision systems.

Related research

The Data-Production Dispositif (05/24/2022)
Machine learning (ML) depends on data to train and verify models. Very o...

In Search of Ambiguity: A Three-Stage Workflow Design to Clarify Annotation Guidelines for Crowd Workers (12/04/2021)
We propose a novel three-stage FIND-RESOLVE-LABEL workflow for crowdsour...

Situated Cameras, Situated Knowledges: Towards an Egocentric Epistemology for Computer Vision (06/30/2023)
In her influential 1988 paper, Situated Knowledges, Donna Haraway uses v...

Re-Examining Human Annotations for Interpretable NLP (04/10/2022)
Explanation methods in Interpretable NLP often explain the model's decis...

Between Subjectivity and Imposition: Power Dynamics in Data Annotation for Computer Vision (07/29/2020)
The interpretation of data is fundamental to machine learning. This pape...

How Crowd Worker Factors Influence Subjective Annotations: A Study of Tagging Misogynistic Hate Speech in Tweets (09/03/2023)
Crowdsourced annotation is vital to both collecting labelled data to tra...

Early Experiences with Crowdsourcing Airway Annotations in Chest CT (06/07/2017)
Measuring airways in chest computed tomography (CT) images is important ...
