Rethinking Artificial Intelligence: Algorithmic Bias and Ethical Issues | How Gender and Type of Algorithmic Group Discrimination Influence Ratings of Algorithmic Decision Making

Sonja Utz

Abstract

Algorithms frequently discriminate against certain groups, and people generally reject such unfairness. However, people sometimes display an egocentric bias when choosing between fairness rules. Two online experiments were conducted to explore whether egocentric biases influence the judgment of biased algorithms. In Experiment 1, an unbiased algorithm was compared with an algorithm favoring males and an algorithm favoring married people. Experiment 2 focused only on the first two conditions. Instead of the expected gender difference in the condition in which the algorithm favored males, a gender difference in the unbiased condition was found in both experiments. Women perceived the unfair algorithm as less fair than men did. Women also perceived the algorithm favoring married people as the least fair. Fairness ratings, however, did not directly translate into permissibility ratings. The results show that egocentric biases are subtle and that women take the social context more into account than men do.


Keywords

algorithm acceptance, algorithmic bias, egocentric bias, fairness, permissibility
