Neural Fair Collaborative Filtering

09/02/2020
by Rashidul Islam, et al.

A growing proportion of human interactions are digitized on social media platforms and subjected to algorithmic decision-making, and it has become increasingly important to ensure fair treatment from these algorithms. In this work, we investigate gender bias in collaborative-filtering recommender systems trained on social media data. We develop neural fair collaborative filtering (NFCF), a practical framework for mitigating gender bias in recommending sensitive items (e.g. jobs, academic concentrations, or courses of study) using a pre-training and fine-tuning approach to neural collaborative filtering, augmented with bias correction techniques. We show the utility of our methods for gender de-biased career and college major recommendations on the MovieLens dataset and a Facebook dataset, respectively, and achieve better performance and fairer behavior than several state-of-the-art models.
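To make the pre-training and fine-tuning idea concrete, here is a minimal PyTorch-style sketch. It is not the authors' implementation: the NCF class, the debias_user_embeddings helper, the genders tensor, and the projection-based de-biasing step (removing a gender direction from pre-trained user embeddings) are illustrative assumptions, and the workflow in the closing comments is only one plausible way to arrange the pre-train, de-bias, and fine-tune stages described in the abstract.

```python
# Illustrative sketch only -- not the NFCF authors' code.
import torch
import torch.nn as nn


class NCF(nn.Module):
    """Neural collaborative filtering: user/item embeddings + MLP scorer."""

    def __init__(self, n_users: int, n_items: int, dim: int = 32):
        super().__init__()
        self.user_emb = nn.Embedding(n_users, dim)
        self.item_emb = nn.Embedding(n_items, dim)
        self.mlp = nn.Sequential(
            nn.Linear(2 * dim, dim), nn.ReLU(), nn.Linear(dim, 1)
        )

    def forward(self, users: torch.Tensor, items: torch.Tensor) -> torch.Tensor:
        x = torch.cat([self.user_emb(users), self.item_emb(items)], dim=-1)
        return torch.sigmoid(self.mlp(x)).squeeze(-1)  # interaction probability


def debias_user_embeddings(model: NCF, genders: torch.Tensor) -> None:
    """Project a gender direction out of pre-trained user embeddings.

    `genders` is a 0/1 tensor with one entry per user (an assumption for
    this sketch). The bias direction is taken as the difference between
    the two group centroids; each user vector is then projected onto the
    orthogonal complement of that direction.
    """
    with torch.no_grad():
        emb = model.user_emb.weight
        v = emb[genders == 1].mean(0) - emb[genders == 0].mean(0)
        v = v / v.norm()
        emb -= (emb @ v).unsqueeze(1) * v  # remove the bias component


# Hypothetical usage:
# 1) pre-train `model` on non-sensitive interactions (e.g., movies or pages),
# 2) call debias_user_embeddings(model, genders),
# 3) freeze the user embeddings and fine-tune on sensitive items
#    (careers or college majors) with a fairness-aware objective.
```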

Related research

03/01/2022
Popularity Bias in Collaborative Filtering-Based Multimedia Recommender Systems
Multimedia recommender systems suggest media items, e.g., songs, (digita...

05/14/2012
A Comparative Study of Collaborative Filtering Algorithms
Collaborative filtering is a rapidly advancing research area. Every year...

02/21/2023
Managing multi-facet bias in collaborative filtering recommender systems
Due to the extensive growth of information available online, recommender...

02/26/2021
History-Augmented Collaborative Filtering for Financial Recommendations
In many businesses, and particularly in finance, the behavior of a clien...

01/18/2022
Emergent Instabilities in Algorithmic Feedback Loops
Algorithms that aid human tasks, such as recommendation systems, are ubi...

08/30/2022
Extracting Relations Between Sectors
The term "sector" in professional business life is a vague concept since...

05/03/2021
Algorithms are not neutral: Bias in collaborative filtering
Discussions of algorithmic bias tend to focus on examples where either t...
