Uncertainty in Fairness Assessment: Maintaining Stable Conclusions Despite Fluctuations

02/02/2023
by Ainhize Barrainkua, et al.

Several recent works encourage the use of a Bayesian framework when assessing the performance and fairness metrics of a classification algorithm in a supervised setting. We propose the Uncertainty Matters (UM) framework, which generalizes a Beta-Binomial approach to derive the posterior distribution of any combination of criteria, allowing stable performance assessment in a bias-aware setting. We model the confusion matrix of each demographic group with a Multinomial distribution, updated through a Bayesian procedure. We further extend UM so that it applies under the popular K-fold cross-validation procedure. Experiments highlight the benefits of UM over classical evaluation frameworks in terms of informativeness and stability.
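As a rough illustration of the Dirichlet-Multinomial machinery the abstract describes, the sketch below places a Dirichlet prior over the four confusion-matrix cells of each demographic group and samples the posterior of a derived fairness criterion. The group names, the counts, and the choice of the true-positive-rate gap as the criterion are illustrative assumptions, not the paper's actual setup or code.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical confusion-matrix counts per demographic group:
# [TP, FP, FN, TN], e.g. tallied on a held-out evaluation set.
counts = {
    "group_a": np.array([80, 20, 15, 85]),
    "group_b": np.array([60, 30, 25, 85]),
}

# Uniform Dirichlet prior over the four cells; with a Multinomial
# likelihood, the posterior over cell probabilities is again Dirichlet.
alpha_prior = np.ones(4)

def posterior_tpr_samples(cell_counts, n_samples=10_000):
    """Sample the posterior of the true positive rate TP / (TP + FN)."""
    theta = rng.dirichlet(alpha_prior + cell_counts, size=n_samples)
    tp, fp, fn, tn = theta.T
    return tp / (tp + fn)

# Posterior of the TPR gap between groups (an equal-opportunity-style
# criterion), summarized by a 95% credible interval.
gap = (posterior_tpr_samples(counts["group_a"])
       - posterior_tpr_samples(counts["group_b"]))
lo, hi = np.percentile(gap, [2.5, 97.5])
print(f"TPR gap: mean={gap.mean():.3f}, 95% CI=({lo:.3f}, {hi:.3f})")
```

Because the posterior is sampled rather than summarized by a point estimate, any function of the confusion-matrix cells (accuracy, a fairness gap, or a combination of the two) inherits a full posterior distribution in the same way.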
