Facebook’s Ad Delivery System and Gender Bias: Insights from a New Study
A recent study sheds light on Facebook’s ad delivery system, suggesting that it is biased along gender lines: women are shown different job listings than men.
Unveiling the Bias
Conducted by researchers at the University of Southern California, the study found that Facebook’s ad delivery mechanism discriminates by gender. As reported by The Verge, women and men were shown distinct sets of job listings.
In the experiment, the researchers purchased Facebook ads for delivery driver positions at different companies with similar qualification requirements. Facebook’s algorithm delivered the Instacart job ads disproportionately to women and the Domino’s job ads disproportionately to men.
The researchers attributed the disparity to existing workforce demographics: Instacart typically employs more female drivers, whereas Domino’s drivers are more often male. They emphasized that such bias in ad delivery can produce gender skew beyond any justifiable difference in qualifications, potentially infringing anti-discrimination laws.
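The skew the researchers describe can be framed as a simple statistical question: did two comparable ads reach significantly different shares of women? The sketch below runs a two-proportion z-test on that question. All counts and audience sizes are invented for illustration and are not the study’s data.

```python
# Hedged sketch: a two-proportion z-test for gender skew between two
# paired job ads. Every number here is a hypothetical illustration.
from math import sqrt, erf

def two_proportion_z_test(women_a, total_a, women_b, total_b):
    """z statistic and two-sided p-value for the difference in the
    share of women reached by ads A and B."""
    p_a, p_b = women_a / total_a, women_b / total_b
    # Pooled proportion under the null hypothesis of identical delivery.
    pooled = (women_a + women_b) / (total_a + total_b)
    se = sqrt(pooled * (1 - pooled) * (1 / total_a + 1 / total_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical: ad A (an Instacart-style listing) vs. ad B (a
# Domino's-style listing), each delivered to 10,000 users.
z, p = two_proportion_z_test(5400, 10000, 3800, 10000)
print(f"z = {z:.2f}, p = {p:.2g}")  # a large z flags delivery skew
```

If both ads had been delivered gender-neutrally, z would hover near zero; a large z with a tiny p-value is the statistical signature of the skew the study reports.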
Insights from LinkedIn
In a parallel experiment on LinkedIn, which is owned by Microsoft, the researchers observed no such skew: the professional networking platform showed the Domino’s listing to as many women as it showed the Instacart listing.
Facebook’s Response
In response, a Facebook spokesperson said the ad delivery system weighs multiple signals to show people ads that match their interests, but acknowledged the concerns raised in the study and pointed to ongoing work to address discrimination in ads.
Facebook has faced allegations of algorithmic bias before. In 2017, investigations revealed that companies including Verizon, Amazon, Goldman Sachs, Target, and Facebook itself limited recruitment ads to specific age groups. Another probe found that Facebook allowed housing advertisers to target audiences by race, potentially violating fair-housing rules.
While Facebook attributed some of these incidents to technical failures, they underscore the need to continually evaluate and refine delivery algorithms for fairness and for compliance with legal and ethical standards.
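As an illustration of what continually evaluating an algorithm can mean in practice, the sketch below computes a demographic parity gap: how far each group’s share of an ad’s impressions drifts from its share of the eligible audience. The data, group labels, and threshold are assumptions for illustration, not Facebook’s actual auditing pipeline.

```python
# Hedged sketch of a fairness audit: the demographic parity gap between
# an ad's delivered impressions and its eligible audience.
from collections import Counter

def parity_gaps(impressions, eligible):
    """For each group, impressions share minus eligible-audience share."""
    shown, pool = Counter(impressions), Counter(eligible)
    n_shown, n_pool = sum(shown.values()), sum(pool.values())
    return {g: shown.get(g, 0) / n_shown - pool[g] / n_pool for g in pool}

# Hypothetical audit data: a balanced eligible audience, skewed delivery.
eligible = ["women"] * 5000 + ["men"] * 5000
impressions = ["women"] * 1800 + ["men"] * 3200

gaps = parity_gaps(impressions, eligible)
worst = max(abs(v) for v in gaps.values())
print(gaps, f"max gap: {worst:.1%}")
# An auditor might flag the ad if the worst gap exceeds some threshold,
# e.g. 0.05 (an assumed value, not a legal or platform standard).
if worst > 0.05:
    print("flag for review")
```

A recurring audit like this cannot prove compliance on its own, but it turns a vague impression of skew into a number that can be tracked over time.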