Racial Disparity in Natural Language Processing: A Case Study of Social Media African-American English

06/30/2017
by Su Lin Blodgett, et al.

We highlight an important frontier in algorithmic fairness: disparity in the quality of natural language processing algorithms when applied to language from authors of different social groups. For example, current systems sometimes analyze the language of women and minorities less accurately than they do the language of whites and males. We conduct an empirical analysis of racial disparity in language identification for tweets written in African-American English, and discuss implications of disparity in NLP.
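The kind of measurement the abstract describes can be illustrated with a minimal sketch: run an off-the-shelf language identifier over two sets of English tweets, one written in African-American English (AAE) and one in white-aligned English, and compare how often each set is correctly labeled as English. This is not the authors' code or data pipeline; the use of the langdetect library and the corpus file names are illustrative assumptions.

```python
# Minimal sketch (not the paper's implementation) of measuring language-ID
# disparity between AAE-aligned and white-aligned tweets.
# Assumes two hypothetical files of English tweets, one per line.

from langdetect import detect, DetectorFactory, LangDetectException

DetectorFactory.seed = 0  # make langdetect deterministic across runs


def english_recall(tweets):
    """Fraction of tweets (all assumed to be English) that the identifier labels 'en'."""
    hits = 0
    for text in tweets:
        try:
            if detect(text) == "en":
                hits += 1
        except LangDetectException:
            pass  # too little signal to classify; counts as a miss
    return hits / len(tweets) if tweets else 0.0


def load_tweets(path):
    with open(path, encoding="utf-8") as f:
        return [line.strip() for line in f if line.strip()]


if __name__ == "__main__":
    aae = load_tweets("aae_tweets.txt")            # hypothetical AAE-aligned corpus
    white_aligned = load_tweets("wae_tweets.txt")  # hypothetical white-aligned corpus

    r_aae = english_recall(aae)
    r_wae = english_recall(white_aligned)

    print(f"English recall, AAE-aligned tweets:   {r_aae:.3f}")
    print(f"English recall, white-aligned tweets: {r_wae:.3f}")
    print(f"Disparity (white-aligned - AAE):      {r_wae - r_aae:.3f}")
```

A gap in English recall between the two corpora is the disparity of interest: both sets consist of English tweets, so any systematic difference reflects the identifier's weaker handling of dialectal language rather than a difference in the underlying languages.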


Related research

08/31/2016
Demographic Dialectal Variation in Social Media: A Case Study of African-American English
Though dialectal language is increasingly abundant on social media, few ...

04/12/2022
Robust Quantification of Gender Disparity in Pre-Modern English Literature using Natural Language Processing
Research has continued to shed light on the extent and significance of g...

09/16/2017
Data Innovation for International Development: An overview of natural language processing for qualitative data analysis
Availability, collection and access to quantitative data, as well as its...

08/02/2023
Manual Tests Do Smell! Cataloging and Identifying Natural Language Test Smells
Background: Test smells indicate potential problems in the design and im...

04/04/2023
Rumour Detection and Analysis on Twitter
In recent years people have become increasingly reliant on social media ...

02/24/2021
Understanding and Mitigating Accuracy Disparity in Regression
With the widespread deployment of large-scale prediction systems in high...

03/22/2022
Racial Disparities in the Enforcement of Marijuana Violations in the US
Racial disparities in US drug arrest rates have been observed for decade...
