Algorithms that "Don't See Color": Comparing Biases in Lookalike and Special Ad Audiences

12/16/2019
by Piotr Sapiezynski, et al.

Today, algorithmic models are shaping important decisions in domains such as credit, employment, and criminal justice. At the same time, these algorithms have been shown to have discriminatory effects. Some organizations have tried to mitigate these effects by removing demographic features from an algorithm's inputs. If an algorithm is not provided with a feature, one might think, then its outputs should not discriminate with respect to that feature. This may not be true, however, when there are other correlated features. In this paper, we explore the limits of this approach using a unique opportunity created by a lawsuit settlement concerning discrimination on Facebook's advertising platform. Facebook agreed to modify its Lookalike Audiences tool - which creates target sets of users for ads by identifying users who share "common qualities" with users in a source audience provided by an advertiser - by removing certain demographic features as inputs to its algorithm. The modified tool, Special Ad Audiences, is intended to reduce the potential for discrimination in target audiences. We create a series of Lookalike and Special Ad audiences based on biased source audiences - i.e., source audiences with known skew along the lines of gender, age, race, and political leaning. We show that the resulting Lookalike and Special Ad audiences both reflect these biases, despite the fact that the Special Ad Audiences algorithm is not provided with the features along which our source audiences are skewed. More broadly, we provide experimental proof that removing demographic features from a real-world algorithmic system's inputs can fail to prevent biased outputs. Organizations using algorithms to mediate access to life opportunities should consider other approaches to mitigating discriminatory effects.
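The mechanism behind this finding can be illustrated with a toy simulation. The sketch below is not the paper's code or Facebook's algorithm; it is a minimal, hypothetical example (all feature names and probabilities are invented for illustration) of how a lookalike-style model trained without a protected attribute can still produce a skewed audience when a correlated proxy feature remains available.

```python
# Hypothetical illustration only: a model that never sees the protected
# attribute can still reproduce a seed audience's demographic skew
# through a correlated proxy feature.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 20_000

# Protected attribute (e.g., a binary demographic group) -- withheld from the model.
group = rng.integers(0, 2, size=n)

# Proxy feature correlated with the group (e.g., coarse location or interests).
proxy = group + rng.normal(0, 0.5, size=n)

# An unrelated feature.
other = rng.normal(0, 1, size=n)

# Seed (source) audience membership is skewed toward group 1.
seed = (rng.random(n) < np.where(group == 1, 0.8, 0.2)).astype(int)

# Train a lookalike-style model WITHOUT the protected attribute as input.
X = np.column_stack([proxy, other])
model = LogisticRegression().fit(X, seed)

# Expand the audience by selecting the top-scoring 10% of users.
scores = model.predict_proba(X)[:, 1]
audience = scores >= np.quantile(scores, 0.9)

print("share of group 1 in population:       ", group.mean())
print("share of group 1 in seed audience:    ", group[seed == 1].mean())
print("share of group 1 in expanded audience:", group[audience].mean())
```

Under these assumptions, the expanded audience over-represents group 1 roughly as much as the seed audience does, even though the group label was never an input, mirroring the paper's observation that "blind" feature removal does not guarantee unbiased outputs.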


