ChatGPT Perpetuates Gender Bias in Machine Translation and Ignores Non-Gendered Pronouns: Findings across Bengali and Five other Low-Resource Languages

05/17/2023
by Sourojit Ghosh, et al.

In this multicultural age, language translation is one of the most frequently performed tasks, and it is becoming increasingly AI-moderated and automated. As a novel AI system, ChatGPT claims to be proficient in such translation tasks, and in this paper, we put that claim to the test. Specifically, we examine ChatGPT's accuracy in translating between English and languages that exclusively use gender-neutral pronouns. We center this study around Bengali, the 7th most spoken language globally, but also generalize our findings across five other languages: Farsi, Malay, Tagalog, Thai, and Turkish. We find that ChatGPT perpetuates gender defaults and stereotypes assigned to certain occupations (e.g., man = doctor, woman = nurse) or actions (e.g., woman = cook, man = go to work), as it converts gender-neutral pronouns in these languages to 'he' or 'she'. We also observe ChatGPT completely failing to translate the English gender-neutral pronoun 'they' into equivalent gender-neutral pronouns in other languages, instead producing translations that are incoherent and incorrect. While it does respect and provide appropriately gender-marked versions of Bengali words when prompted with gender information in English, ChatGPT appears to confer higher respect on men than on women in the same occupation. We conclude that ChatGPT exhibits the same gender biases that have been demonstrated for tools like Google Translate and MS Translator, and we provide recommendations for a human-centered approach that future designers of AI translation systems can take to better accommodate such low-resource languages.
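To make the probing setup concrete, below is a minimal sketch of how one could run this kind of occupation-pronoun probe programmatically. It assumes the OpenAI Python client and the gpt-3.5-turbo model as stand-ins for the ChatGPT interface described in the study; the Bengali probe sentences and the prompt phrasing are illustrative assumptions, not the paper's exact stimuli.

```python
# Minimal sketch of an occupation-pronoun probe for gender-neutral source sentences.
# Assumes the OpenAI Python client (>=1.0) and an OPENAI_API_KEY in the environment;
# model name, prompt wording, and probe sentences are illustrative, not the paper's.
from openai import OpenAI

client = OpenAI()

# Bengali sentences using the gender-neutral pronoun "সে" with different occupations.
probes = {
    "doctor": "সে একজন ডাক্তার।",  # "They are a doctor."
    "nurse": "সে একজন নার্স।",     # "They are a nurse."
}

for occupation, sentence in probes.items():
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[
            {
                "role": "user",
                "content": f"Translate the following Bengali sentence to English: {sentence}",
            }
        ],
        temperature=0,
    )
    translation = response.choices[0].message.content
    # Inspect which English pronoun the model defaults to for a gender-neutral source.
    print(f"{occupation}: {translation}")
```

Comparing the pronoun chosen for each occupation against the gender-neutral source pronoun is what surfaces the defaulting behavior (e.g., 'he' for doctor, 'she' for nurse) reported in the abstract.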

