A Fairness-aware Hybrid Recommender System

09/13/2018
by   Golnoosh Farnadi, et al.
University of California Santa Cruz

Recommender systems are used in a variety of domains affecting people's lives. This has raised concerns about possible biases and discrimination that such systems might exacerbate. There are two primary kinds of biases inherent in recommender systems: observation bias and bias stemming from imbalanced data. Observation bias arises from a feedback loop that causes the model to learn to predict only recommendations similar to previous ones. Imbalance in data occurs when systematic societal, historical, or other ambient bias is present in the data. In this paper, we address both biases by proposing a hybrid fairness-aware recommender system. Our model provides efficient and accurate recommendations by incorporating multiple user-user and item-item similarity measures, content, and demographic information, while addressing recommendation biases. We implement our model in probabilistic soft logic, a powerful and expressive probabilistic programming language. We experimentally evaluate our approach on a popular movie recommendation dataset, showing that our proposed model provides more accurate and fairer recommendations than a state-of-the-art fair recommender system.
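The paper's model is implemented in probabilistic soft logic; as a rough, PSL-free illustration of the hybrid idea the abstract describes (blending user-user and item-item similarity signals into one prediction), one might sketch a neighborhood-based blend in NumPy. The function names, the blending weight `alpha`, and the zero-for-unobserved rating convention are illustrative assumptions, not details from the paper.

```python
import numpy as np

def cosine_sim(M):
    """Row-wise cosine similarity of a matrix, with zero-safe norms."""
    norms = np.linalg.norm(M, axis=1, keepdims=True)
    norms[norms == 0] = 1.0  # avoid division by zero for empty rows
    U = M / norms
    return U @ U.T

def hybrid_predict(R, alpha=0.5):
    """Blend user-based and item-based neighborhood scores.

    R     : users x items rating matrix, zeros mark unobserved entries.
    alpha : weight on the user-based term (illustrative parameter).
    """
    user_sim = cosine_sim(R)     # users x users similarity
    item_sim = cosine_sim(R.T)   # items x items similarity

    # User-based: each prediction is a similarity-weighted average
    # of other users' ratings for that item.
    user_scores = user_sim @ R / np.maximum(
        user_sim.sum(axis=1, keepdims=True), 1e-9)
    # Item-based: similarity-weighted average of the user's ratings
    # on similar items.
    item_scores = R @ item_sim / np.maximum(
        item_sim.sum(axis=0, keepdims=True), 1e-9)

    return alpha * user_scores + (1 - alpha) * item_scores
```

In the paper itself these similarity signals are expressed as weighted logical rules in PSL and combined by joint inference, rather than by a fixed linear blend like the `alpha` mix above.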

Related papers:

FairRoad: Achieving Fairness for Recommender Systems with Optimized Antidote Data (12/13/2022)
Today, recommender systems have played an increasingly important role in...

HyperFair: A Soft Approach to Integrating Fairness Criteria (09/05/2020)
Recommender systems are being employed across an increasingly diverse se...

Towards Fair Personalization by Avoiding Feedback Loops (12/20/2020)
Self-reinforcing feedback loops are both cause and effect of over and/or...

Provider Fairness and Beyond-Accuracy Trade-offs in Recommender Systems (09/08/2023)
Recommender systems, while transformative in online user experiences, ha...

Deconfounded Recommendation for Alleviating Bias Amplification (05/22/2021)
Recommender systems usually amplify the biases in the data. The model le...

Optimizing Slate Recommendations via Slate-CVAE (03/05/2018)
The slate recommendation problem aims to find the "optimal" ordering of ...

When the Umpire is also a Player: Bias in Private Label Product Recommendations on E-commerce Marketplaces (01/30/2021)
Algorithmic recommendations mediate interactions between millions of cus...
