A Comprehensive Evaluation of the Sensitivity of Density-Ratio Estimation Based Fairness Measurement in Regression

📅 2025-08-20
📈 Citations: 0
Influential: 0
🤖 AI Summary
This paper addresses the high sensitivity of regression fairness metrics to the choice of density ratio estimation (DRE) algorithm, which leads to inconsistent and unreliable fairness assessments. We systematically evaluate several mainstream DRE methods, including logistic regression classifiers, KLIEP, and uLSIF, within the context of fairness quantification. Through empirical analysis, we show for the first time that different DRE backbones can substantially alter fairness scores and even reverse the relative fairness ranking among models, exposing a fundamental lack of robustness in existing DRE-based fairness measures. To address this, we propose a principled evaluation framework and diagnostic criteria for assessing DRE-based fairness estimators. Our work provides both theoretical foundations and practical guidelines for developing robust, reproducible regression fairness metrics.

📝 Abstract
The prevalence of algorithmic bias in Machine Learning (ML)-driven approaches has inspired growing research on measuring and mitigating bias in the ML domain. Accordingly, prior research has studied how to measure fairness in regression, which is a complex problem. In particular, recent work proposed to formulate it as a density-ratio estimation problem and relied on a Logistic Regression-driven, probabilistic-classifier-based approach to solve it. However, there are several other methods to estimate a density ratio, and to the best of our knowledge, prior work did not study the sensitivity of such fairness measurement methods to the choice of the underlying density-ratio estimation algorithm. To fill this gap, this paper develops a set of fairness measurement methods with various density-ratio estimation cores and thoroughly investigates how different cores affect the achieved level of fairness. Our experimental results show that the choice of density-ratio estimation core can significantly affect the outcome of a fairness measurement method and can even generate inconsistent results with respect to the relative fairness of various algorithms. These observations suggest major issues with density-ratio-estimation-based fairness measurement in regression and a need for further research to enhance its reliability.
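The abstract notes that prior work casts regression fairness measurement as a density-ratio estimation problem solved with a logistic-regression probabilistic classifier. A minimal sketch of that classifier-based DRE idea, not the paper's actual setup: the 1-D Gaussian data, features, learning rate, and iteration count below are illustrative assumptions. A classifier trained to separate numerator samples (label 1) from denominator samples (label 0) yields the ratio via r(x) ≈ P(y=1|x) / P(y=0|x), up to a prior-correction factor.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative samples from the two densities whose ratio we want.
xp = rng.normal(0.0, 1.0, size=2000)   # numerator p = N(0, 1)
xq = rng.normal(0.5, 1.0, size=2000)   # denominator q = N(0.5, 1)

# Train a logistic regression to discriminate p (label 1) from q (label 0).
X = np.concatenate([xp, xq])
y = np.concatenate([np.ones_like(xp), np.zeros_like(xq)])
feats = np.stack([np.ones_like(X), X], axis=1)  # bias + linear term

w = np.zeros(2)
for _ in range(2000):                   # plain gradient descent on log-loss
    p1 = 1.0 / (1.0 + np.exp(-feats @ w))
    w -= 0.1 * feats.T @ (p1 - y) / len(y)

def ratio(x):
    """Classifier-based estimate of r(x) = p(x)/q(x)."""
    p1 = 1.0 / (1.0 + np.exp(-(w[0] + w[1] * x)))
    # With equal sample sizes the prior correction n_q / n_p equals 1.
    return p1 / (1.0 - p1)

def true_r(x):
    """Closed-form ratio of the two Gaussians, for comparison."""
    return np.exp(-x**2 / 2 + (x - 0.5)**2 / 2)

print(ratio(0.0), true_r(0.0))
```

Because the log-ratio of two equal-variance Gaussians is linear in x, the bias-plus-linear feature map is correctly specified here; a harder density pair would need richer features or a nonlinear classifier, which is exactly the kind of backbone choice the paper probes.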
Problem

Research questions and friction points this paper is trying to address.

Evaluates sensitivity of density-ratio estimation methods for fairness measurement
Investigates how different density-ratio cores affect regression fairness outcomes
Identifies inconsistent results from algorithmic choices in fairness assessment
Innovation

Methods, ideas, or system contributions that make the work stand out.

Evaluated sensitivity of density-ratio estimation fairness methods
Developed multiple density-ratio estimation core approaches
Compared different cores' impact on fairness measurement outcomes
Abdalwahab Almajed
Imam Abdulrahman Bin Faisal University, Dammam, Saudi Arabia
Maryam Tabar
University of Texas at San Antonio
Machine Learning · Data Science for Social Good
Peyman Najafirad
The University of Texas at San Antonio, San Antonio, Texas, USA