Undesirable biases in NLP: Averting a crisis of measurement

by Oskar van der Wal, et al.

As Natural Language Processing (NLP) technology rapidly develops and spreads into daily life, it becomes crucial to anticipate how its use could harm people. However, our ways of assessing the biases of NLP models have not kept up. While the detection of English gender bias in such models, in particular, has received increasing research attention, many of the measures face serious problems: it is often unclear what they actually measure and how much they are subject to measurement error. In this paper, we provide an interdisciplinary approach to discussing the issue of NLP model bias by adopting the lens of psychometrics, a field specialized in the measurement of concepts like bias that are not directly observable. We pair an introduction of relevant psychometric concepts with a discussion of how they could be used to evaluate and improve bias measures. We also argue that adopting psychometric vocabulary and methodology can make NLP bias research more efficient and transparent.
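To make the object of the discussion concrete: the embedding-association tests that much of this literature studies (e.g., WEAT-style effect sizes) can be sketched in a few lines. The toy 2-d vectors, set sizes, and the crude split-half stability check below are illustrative assumptions for this sketch, not the authors' method or code:

```python
import numpy as np

rng = np.random.default_rng(0)

def _cos(u, v):
    """Cosine similarity between two vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

def association(w, A, B):
    """s(w, A, B): mean cosine similarity of w with attribute set A minus with B."""
    return np.mean([_cos(w, a) for a in A]) - np.mean([_cos(w, b) for b in B])

def weat_effect_size(X, Y, A, B):
    """Cohen's-d-style effect size over target sets X, Y and attribute sets A, B."""
    sX = [association(x, A, B) for x in X]
    sY = [association(y, A, B) for y in Y]
    return (np.mean(sX) - np.mean(sY)) / np.std(sX + sY, ddof=1)

def noisy(base, n, scale=0.1):
    """n noisy copies of a 2-d base vector (toy stand-ins for word embeddings)."""
    return [np.asarray(base) + rng.normal(0.0, scale, size=2) for _ in range(n)]

# Toy embeddings constructed so that target set X leans toward attribute set A
# and Y toward B; a real analysis would use trained word vectors instead.
A = noisy([1.0, 0.0], 8)   # attribute set A (e.g. male-associated terms)
B = noisy([0.0, 1.0], 8)   # attribute set B (e.g. female-associated terms)
X = noisy([0.9, 0.1], 8)   # target set X (e.g. career terms)
Y = noisy([0.1, 0.9], 8)   # target set Y (e.g. family terms)

d = weat_effect_size(X, Y, A, B)

# A crude stability probe in the psychometric spirit: does the score survive
# halving the attribute sets? (Real reliability analyses are far richer.)
d_half1 = weat_effect_size(X, Y, A[:4], B[:4])
d_half2 = weat_effect_size(X, Y, A[4:], B[4:])
```

The measurement-error worry the abstract raises shows up exactly here: a single effect size says nothing about how much `d` would vary under a different choice of attribute words, which is what reliability concepts from psychometrics are designed to quantify.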


