Laypeople's Attitudes Towards Fair, Affirmative, and Discriminatory Decision-Making Algorithms

📅 2025-05-12
📈 Citations: 0
Influential: 0
🤖 AI Summary
This study investigates public attitudes toward three types of decision-making algorithms—fair (group-neutral), discriminatory (favoring privileged groups), and ameliorative (favoring historically marginalized groups)—in hiring and judicial contexts. Method: Two behavioral experiments (N = 1,193) employed surveys and multivariate statistical analyses, including tests of interaction effects, to assess attitudinal responses across political ideology and racial identity. Contribution/Results: Participants broadly endorsed fair algorithms and rejected discriminatory ones; however, attitudes toward ameliorative algorithms diverged sharply: liberals and racial minorities evaluated them similarly to fair algorithms, whereas conservatives and majority-race participants perceived them as equivalent to discriminatory algorithms. Crucially, this attitudinal split is rooted in subjective perceptions of *who is marginalized*, with political orientation and racial identity systematically moderating the link between these perceptions and attitudes. This work provides empirical evidence of this perceptual mechanism and offers actionable insights for designing socially acceptable algorithmic justice frameworks.

📝 Abstract
Affirmative algorithms have emerged as a potential answer to algorithmic discrimination, seeking to redress past harms and rectify the source of historical injustices. We present the results of two experiments (N = 1,193) capturing laypeople's perceptions of affirmative algorithms -- those which explicitly prioritize the historically marginalized -- in hiring and criminal justice. We contrast these opinions about affirmative algorithms with folk attitudes towards algorithms that prioritize the privileged (i.e., discriminatory) and systems that make decisions independently of demographic groups (i.e., fair). We find that people -- regardless of their political leaning and identity -- view fair algorithms favorably and denounce discriminatory systems. In contrast, we identify disagreements concerning affirmative algorithms: liberals and racial minorities rate affirmative systems as positively as their fair counterparts, whereas conservatives and those from the dominant racial group evaluate affirmative algorithms as negatively as discriminatory systems. We identify a source of these divisions: people have varying beliefs about who (if anyone) is marginalized, shaping their views of affirmative algorithms. We discuss the possibility of bridging these disagreements to bring people together towards affirmative algorithms.
Problem

Research questions and friction points this paper is trying to address.

Examining laypeople's perceptions of affirmative algorithms in hiring and criminal justice
Comparing attitudes towards fair, affirmative, and discriminatory decision-making algorithms
Investigating disagreements about affirmative algorithms along political and racial identity lines
Innovation

Methods, ideas, or system contributions that make the work stand out.

Affirmative algorithms prioritize historically marginalized groups
Contrasts fair, affirmative, and discriminatory algorithmic approaches
Measures laypeople's perceptions across political and identity groups