Gender Coreference and Bias Evaluation at WMT 2020

10/12/2020
by   Tom Kocmi, et al.

Gender bias in machine translation can manifest when a system chooses gender inflections based on spurious gender correlations, for example, always translating doctors as men and nurses as women. This can be particularly harmful as models grow more popular and are deployed within commercial systems. Our work presents the largest evidence for the phenomenon to date, covering more than 19 systems submitted to WMT across four diverse target languages: Czech, German, Polish, and Russian. To achieve this, we use WinoMT, a recent automatic test suite that examines gender coreference and bias when translating from English into languages with grammatical gender. We extend WinoMT to handle two new languages tested at WMT: Polish and Czech. We find that all systems consistently rely on spurious correlations in the data rather than on meaningful contextual information.
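The evaluation the abstract describes can be illustrated with a minimal sketch. The real WinoMT suite aligns the English entity to its translation and uses language-specific morphological analysis to detect its grammatical gender; the example data, the toy German cue lexicon, and all function names below are hypothetical simplifications, not WinoMT's actual API.

```python
# Hypothetical sketch of a WinoMT-style check: compare the gender a
# system produced for a profession against the gender the source
# pronoun dictates, then split accuracy by stereotypicality.
from dataclasses import dataclass

@dataclass
class Example:
    entity: str           # English profession the pronoun refers to
    gold_gender: str      # "male" or "female", fixed by the source pronoun
    stereotypical: bool   # does the gold gender match the stereotype?
    translation: str      # system output in the target language

# Toy cue lexicon (assumption): a couple of feminine German forms.
FEMALE_CUES = (" Ärztin", " Krankenschwester")

def detect_gender(translation: str) -> str:
    # Real systems use alignment + morphological analyzers instead.
    return "female" if any(c in translation for c in FEMALE_CUES) else "male"

def evaluate(examples):
    def acc(subset):
        hits = sum(detect_gender(e.translation) == e.gold_gender for e in subset)
        return hits / len(subset)
    pro = [e for e in examples if e.stereotypical]
    anti = [e for e in examples if not e.stereotypical]
    # Second value mirrors WinoMT's pro- vs. anti-stereotypical gap:
    # a large positive gap means the system leans on stereotypes.
    return acc(examples), acc(pro) - acc(anti)

examples = [
    Example("doctor", "female", False, "Der Arzt untersuchte sie."),
    Example("doctor", "male",   True,  "Der Arzt untersuchte ihn."),
    Example("nurse",  "female", True,  "Die Krankenschwester half ihr."),
    Example("nurse",  "male",   False, "Der Krankenpfleger half ihm."),
]
accuracy, stereotype_gap = evaluate(examples)
```

On this toy set the first translation is wrong (a female doctor rendered as male), so overall accuracy is 0.75 while pro-stereotypical accuracy (1.0) exceeds anti-stereotypical accuracy (0.5), a gap of 0.5, exactly the "spurious correlation" signature the paper reports across WMT systems.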

