Model-Agnostic Gender Debiased Image Captioning

04/07/2023
by Yusuke Hirota et al.

Image captioning models are known to perpetuate and amplify harmful societal bias present in their training sets. In this work, we aim to mitigate such gender bias in image captioning models. While prior work has addressed this problem by forcing models to focus on people to reduce gender misclassification, this approach conversely generates gender-stereotypical words at the expense of predicting the correct gender. From this observation, we hypothesize that two types of gender bias affect image captioning models: 1) bias that exploits context to predict gender, and 2) bias in the probability of generating certain (often stereotypical) words because of gender. To mitigate both, we propose LIBRA, a model-agnostic framework that learns from synthetically biased samples to decrease both types of bias, correcting gender misclassification and changing gender-stereotypical words to more neutral ones.
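To make the idea of "synthetically biased samples" concrete, the following is a minimal sketch of how such training pairs could be constructed for the two hypothesized bias types: flipping gendered words (simulating context-driven gender misclassification) and substituting stereotypical words (simulating gender-conditioned wording). All function names, word lists, and substitution choices here are illustrative assumptions, not LIBRA's actual implementation.

```python
import random

# Toy word lists (assumptions for illustration only).
GENDER_SWAP = {"man": "woman", "woman": "man", "he": "she", "she": "he",
               "his": "her", "her": "his", "boy": "girl", "girl": "boy"}
STEREOTYPE_SUBS = {"standing": "cooking", "sitting": "shopping"}  # toy examples

def flip_gender(caption: str) -> str:
    """Simulate bias type 1 (context-driven gender misclassification)
    by swapping gendered words in the caption."""
    return " ".join(GENDER_SWAP.get(w, w) for w in caption.split())

def inject_stereotype(caption: str, rng: random.Random) -> str:
    """Simulate bias type 2 (gender-conditioned stereotypical wording)
    by replacing a neutral word with a stereotypical one."""
    words = caption.split()
    candidates = [i for i, w in enumerate(words) if w in STEREOTYPE_SUBS]
    if candidates:
        i = rng.choice(candidates)
        words[i] = STEREOTYPE_SUBS[words[i]]
    return " ".join(words)

def make_biased_pair(caption: str, seed: int = 0):
    """Return (biased_caption, original_caption): a training pair for a
    caption-editing model that learns to undo the injected bias."""
    rng = random.Random(seed)
    if rng.random() < 0.5:
        biased = flip_gender(caption)
    else:
        biased = inject_stereotype(caption, rng)
    return biased, caption
```

A debiasing model trained on such pairs sees the corrupted caption as input and the original as the target, so at inference time it can correct misgendered words and rewrite stereotypical ones.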


Related research

12/02/2019  Exposing and Correcting the Gender Bias in Image Captioning Datasets and Models
03/29/2022  Quantifying Societal Bias Amplification in Image Captioning
06/16/2021  Understanding and Evaluating Racial Biases in Image Captioning
05/03/2023  Fairness in AI Systems: Mitigating gender bias from language-vision models
03/26/2018  Women also Snowboard: Overcoming Bias in Captioning Models
06/15/2020  Mitigating Gender Bias in Captioning Systems
07/02/2018  Women also Snowboard: Overcoming Bias in Captioning Models (Extended Abstract)
