A Causal Framework to Measure and Mitigate Non-binary Treatment Discrimination

📅 2025-03-28
📈 Citations: 0
Influential: 0
🤖 AI Summary
Prior fairness research has largely overlooked implicit discrimination arising from non-binary treatment decisions—such as loan terms or bail conditions—despite their critical role in real-world algorithmic decision-making. Method: We propose the first causal fairness framework explicitly centered on non-binary treatments, grounded in structural causal models and counterfactual reasoning. It rigorously separates individual covariates from treatment assignments, enabling interpretable attribution and correction of treatment disparities and their effects on downstream outcomes (e.g., default, recidivism). Contribution/Results: Empirical evaluation across four benchmark lending datasets reveals substantial non-binary treatment discrimination in existing risk scoring systems. Our framework enables causal interventions at the treatment level—adjusting terms rather than merely recalibrating scores—yielding significant improvements in group fairness (e.g., equalized treatment distribution, reduced outcome disparity) without compromising predictive validity. This approach balances stakeholder interests by respecting both decision-maker objectives and affected individuals' rights.
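The separation of covariates from non-binary treatment assignments can be illustrated with a toy structural causal model. The sketch below is purely illustrative: the variable names, coefficients, and data-generating process are assumptions for demonstration, not the paper's actual model or datasets. It simulates a lending scenario where the treatment (an interest rate) depends partly on a protected attribute, then measures the resulting treatment disparity between groups.

```python
# Hypothetical sketch: measuring non-binary treatment disparity in a toy SCM.
# All names and coefficients are illustrative assumptions, not the paper's model.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Toy structural causal model: sensitive attribute A, covariate X,
# non-binary treatment T (e.g., interest rate), outcome Y (repayment).
A = rng.integers(0, 2, n)                       # protected group indicator
X = rng.normal(0, 1, n)                         # creditworthiness covariate
# Biased treatment assignment: group A=1 receives harsher terms (+1.5)
T = 5.0 + 1.5 * A - 0.8 * X + rng.normal(0, 0.5, n)
# Downstream outcome worsens as the treatment gets harsher
Y = (X - 0.3 * T + rng.normal(0, 1, n) > -1.5).astype(int)

# Treatment disparity: difference in mean treatment between groups
disparity = T[A == 1].mean() - T[A == 0].mean()
print(f"treatment disparity (A=1 minus A=0): {disparity:.2f}")
```

In this simulation the disparity recovers the assumed group effect of about 1.5, since the covariate X is independent of A; in real data the framework's causal structure is what licenses attributing such a gap to the treatment decision rather than to covariates.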

📝 Abstract
Fairness studies of algorithmic decision-making systems often simplify complex decision processes, such as bail or loan approvals, into binary classification tasks. However, these approaches overlook that such decisions are not inherently binary (e.g., approve or not approve bail or loan); they also involve non-binary treatment decisions (e.g., bail conditions or loan terms) that can influence the downstream outcomes (e.g., loan repayment or reoffending). In this paper, we argue that non-binary treatment decisions are integral to the decision process and controlled by decision-makers and, therefore, should be central to fairness analyses in algorithmic decision-making. We propose a causal framework that extends fairness analyses and explicitly distinguishes between decision-subjects' covariates and the treatment decisions. This specification allows decision-makers to use our framework to (i) measure treatment disparity and its downstream effects in historical data and, using counterfactual reasoning, (ii) mitigate the impact of past unfair treatment decisions when automating decision-making. We use our framework to empirically analyze four widely used loan approval datasets to reveal potential disparity in non-binary treatment decisions and their discriminatory impact on outcomes, highlighting the need to incorporate treatment decisions in fairness assessments. Moreover, by intervening in treatment decisions, we show that our framework effectively mitigates treatment discrimination from historical data to ensure fair risk score estimation and (non-binary) decision-making processes that benefit all stakeholders.
Problem

Research questions and friction points this paper is trying to address.

Measure non-binary treatment discrimination in algorithmic decisions
Mitigate unfair treatment impact using causal framework
Analyze loan datasets for treatment disparity effects
Innovation

Methods, ideas, or system contributions that make the work stand out.

Extends fairness analyses with causal framework
Measures treatment disparity using counterfactual reasoning
Mitigates unfair treatment in automated decisions
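The mitigation idea above can be sketched with a counterfactual intervention on the treatment: abduct each individual's exogenous noise from the observed treatment, then recompute the treatment under a fair assignment (here, removing the protected attribute's direct effect). This is a minimal sketch assuming the structural coefficients are known; in practice they would have to be estimated from the causal model, and the variable names are hypothetical.

```python
# Hypothetical sketch: mitigating treatment discrimination via a
# counterfactual intervention (abduction, then action on A in the
# treatment equation). Coefficients are assumed known for illustration.
import numpy as np

rng = np.random.default_rng(1)
n = 10_000
A = rng.integers(0, 2, n)                       # protected group indicator
X = rng.normal(0, 1, n)                         # creditworthiness covariate
U_T = rng.normal(0, 0.5, n)                     # exogenous treatment noise
T = 5.0 + 1.5 * A - 0.8 * X + U_T               # observed (biased) treatment

# Abduction: recover each individual's noise from the observed treatment
u_hat = T - (5.0 + 1.5 * A - 0.8 * X)           # equals U_T exactly here
# Action: set A := 0 in the treatment equation, keeping u_hat fixed
T_fair = 5.0 - 0.8 * X + u_hat                  # counterfactual treatment

gap_before = T[A == 1].mean() - T[A == 0].mean()
gap_after = T_fair[A == 1].mean() - T_fair[A == 0].mean()
print(f"disparity before: {gap_before:.2f}, after: {gap_after:.2f}")
```

Because the counterfactual treatment depends only on X and the individual's own noise, the group-level disparity collapses to roughly zero while individual variation in terms is preserved, which is the sense in which intervening at the treatment level differs from merely recalibrating risk scores.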