Gender Bias in Coreference Resolution: Evaluation and Debiasing Methods

04/18/2018
by Jieyu Zhao et al.

We introduce a new benchmark, WinoBias, for coreference resolution focused on gender bias. Our corpus contains Winograd-schema style sentences with entities corresponding to people referred to by their occupation (e.g. the nurse, the doctor, the carpenter). We demonstrate that a rule-based, a feature-rich, and a neural coreference system all link gendered pronouns to pro-stereotypical entities with higher accuracy than to anti-stereotypical entities, by an average difference of 21.1 in F1 score. Finally, we demonstrate a data-augmentation approach that, in combination with existing word-embedding debiasing techniques, removes the bias demonstrated by these systems on WinoBias without significantly affecting their performance on existing coreference benchmark datasets. Our dataset and code are available at http://winobias.org.
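The data-augmentation approach the abstract mentions is rule-based gender swapping: each gendered word in a training sentence is replaced by its opposite-gender counterpart, and the model is trained on the union of the original and swapped corpora. A minimal sketch of the idea, with a hypothetical (and deliberately tiny) word-pair table; the actual augmentation also anonymizes names and handles ambiguous forms such as possessive "his"/"her", which require part-of-speech information and are omitted here:

```python
# A small, illustrative table of gendered word pairs. The real intervention
# uses a much larger dictionary; ambiguous cases like "his"/"her" are left
# out of this sketch because they cannot be swapped without POS context.
GENDER_PAIRS = [
    ("he", "she"),
    ("him", "her"),
    ("himself", "herself"),
    ("man", "woman"),
    ("father", "mother"),
]


def build_swap_table(pairs):
    """Map each gendered word to its counterpart, in both directions."""
    table = {}
    for a, b in pairs:
        table[a] = b
        table[b] = a
    return table


def gender_swap(tokens, table=None):
    """Return a gender-swapped copy of a tokenized sentence.

    Words not in the table (e.g. occupation nouns like "nurse" or
    "doctor") pass through unchanged, so only the gender signal flips.
    This sketch lowercases tokens for lookup and does not restore
    sentence-initial capitalization.
    """
    table = table or build_swap_table(GENDER_PAIRS)
    return [table.get(tok.lower(), tok) for tok in tokens]
```

Training on both the original sentence "The nurse helped him" and its swapped copy "The nurse helped her" gives the coreference system no occupation-to-gender signal to latch onto, which is what removes the pro-stereotypical advantage on WinoBias.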


Related research

- Gender Bias in Coreference Resolution (04/25/2018): We present an empirical study of gender bias in coreference resolution s…
- The Hard-CoRe Coreference Corpus: Removing Gender and Number Cues for Difficult Pronominal Anaphora Resolution (11/02/2018): We introduce a new benchmark task for coreference resolution, Hard-CoRe…
- Reducing Gender Bias in Word-Level Language Models with a Gender-Equalizing Loss Function (05/30/2019): Gender bias exists in natural language datasets which neural language mo…
- Collecting a Large-Scale Gender Bias Dataset for Coreference Resolution and Machine Translation (09/08/2021): Recent works have found evidence of gender bias in models of machine tra…
- Anaphora Resolution in Dialogue Systems for South Asian Languages (11/22/2019): Anaphora resolution is a challenging task which has been the interest of…
- PronounFlow: A Hybrid Approach for Calibrating Pronouns in Sentences (08/29/2023): Flip through any book or listen to any song lyrics, and you will come ac…
- Gender-Inclusive Grammatical Error Correction through Augmentation (06/12/2023): In this paper we show that GEC systems display gender bias related to th…
