Rethinking Artificial Intelligence: Algorithmic Bias and Ethical Issues | How Gender and Type of Algorithmic Group Discrimination Influence Ratings of Algorithmic Decision Making

Authors

  • Sonja Utz, Leibniz-Institut für Wissensmedien, University of Tübingen

Keywords:

algorithm acceptance, algorithmic bias, egocentric bias, fairness, permissibility

Abstract

Algorithms frequently discriminate against certain groups, and people generally reject such unfairness. However, people sometimes display an egocentric bias when choosing between fairness rules. Two online experiments were conducted to explore whether egocentric biases influence the judgment of biased algorithms. In Experiment 1, an unbiased algorithm was compared with an algorithm favoring males and an algorithm favoring married people. Experiment 2 focused only on the first two conditions. Instead of the expected gender difference in the condition in which the algorithm favored males, a gender difference in the unbiased condition was found in both experiments. Women perceived the unfair algorithm as less fair than men did. Women also perceived the algorithm favoring married people as the least fair. Fairness ratings, however, did not directly translate into permissibility ratings. The results show that egocentric biases are subtle and that women take the social context more into account than men do.

Author Biography

Sonja Utz, Leibniz-Institut für Wissensmedien, University of Tübingen

Head of the Junior Research Group Social Media. Since April 2014, she has also been Professor of Communication via Social Media at the University of Tübingen.

Published

2023-12-26

Section

Special Sections