Incorporating Rather Than Eliminating: Achieving Fairness for Skin Disease Diagnosis Through Group-Specific Expert

📅 2025-06-21
📈 Citations: 0
Influential: 0
🤖 AI Summary
AI diagnostic models for skin diseases often suffer from population-level bias, leading to a trade-off between fairness and accuracy. Conventional debiasing approaches—by removing correlations with sensitive attributes—risk discarding clinically relevant features, thereby degrading performance. To address this, we propose FairMoE, a novel framework that treats sensitive attributes as modeling guidance rather than nuisance variables. FairMoE employs a hierarchical Mixture-of-Experts (MoE) architecture with dynamic routing, enabling group-specific learning and adaptive assignment of boundary samples. Crucially, it preserves diagnostically discriminative features while explicitly modeling population heterogeneity. Experiments demonstrate that FairMoE achieves fairness metrics—including equal opportunity difference and predictive equality—comparable to or better than state-of-the-art baselines, while significantly improving overall diagnostic accuracy. Thus, FairMoE effectively breaks the conventional fairness–performance trade-off.

📝 Abstract
AI-based systems have achieved high accuracy in skin disease diagnostics but often exhibit biases across demographic groups, leading to inequitable healthcare outcomes and diminished patient trust. Most existing bias mitigation methods attempt to eliminate the correlation between sensitive attributes and diagnostic predictions, but these methods often degrade performance due to the loss of clinically relevant diagnostic cues. In this work, we propose an alternative approach that incorporates sensitive attributes to achieve fairness. We introduce FairMoE, a framework that employs layer-wise mixture-of-experts modules as group-specific learners. Unlike traditional methods that rigidly assign data based on group labels, FairMoE dynamically routes data to the most suitable expert, making it particularly effective for handling cases near group boundaries. Experimental results show that, unlike previous fairness approaches that reduce performance, FairMoE achieves substantial accuracy improvements while preserving comparable fairness metrics.
Problem

Research questions and friction points this paper is trying to address.

AI skin diagnosis exhibits biases across demographic groups
Existing bias mitigation methods degrade diagnostic performance
FairMoE dynamically routes data to improve accuracy and fairness
Innovation

Methods, ideas, or system contributions that make the work stand out.

Incorporates sensitive attributes for fairness
Uses group-specific expert learners
Dynamically routes data to experts
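The dynamic-routing idea behind these contributions can be illustrated with a minimal mixture-of-experts layer: a gating network scores each expert per sample, and the output is the gate-weighted combination of expert outputs, so samples near group boundaries draw on multiple group-specific experts rather than being hard-assigned by group label. This is a hedged sketch of the general MoE mechanism, not the authors' FairMoE implementation; all names and shapes here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    # Numerically stable softmax over the last axis.
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

class MoELayer:
    """Minimal soft-routing MoE layer (illustrative, not FairMoE itself).

    Each expert is a linear map; a learned gate produces per-sample
    mixing weights, so boundary samples blend several experts.
    """
    def __init__(self, dim_in, dim_out, n_experts):
        self.experts = [rng.standard_normal((dim_in, dim_out)) * 0.1
                        for _ in range(n_experts)]
        self.gate = rng.standard_normal((dim_in, n_experts)) * 0.1

    def forward(self, x):
        weights = softmax(x @ self.gate)                         # (batch, n_experts)
        outs = np.stack([x @ w for w in self.experts], axis=1)   # (batch, n_experts, dim_out)
        # Gate-weighted sum over experts for each sample.
        return np.einsum("be,bed->bd", weights, outs), weights

# One expert per demographic group, e.g. two groups here.
layer = MoELayer(dim_in=8, dim_out=4, n_experts=2)
x = rng.standard_normal((5, 8))
y, w = layer.forward(x)
print(y.shape, w.shape)  # per-sample outputs and routing weights
```

In a layer-wise design such as the one the paper describes, a module like this would replace individual layers of the diagnostic network, letting routing decisions adapt at every depth rather than once per sample.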
Gelei Xu
University of Notre Dame

Yuying Duan
University of Notre Dame
Federated Learning

Zheyuan Liu
University of Notre Dame, Notre Dame, IN 46556, USA

Xueyang Li
University of Notre Dame
Medical Image

Meng Jiang
University of Notre Dame, Notre Dame, IN 46556, USA

Michael Lemmon
University of Notre Dame, Notre Dame, IN 46556, USA

Wei Jin
Emory University, Atlanta, GA 30322, USA

Yiyu Shi
Full Professor, University of Notre Dame
hardware/software co-design · deep learning acceleration · on-device AI · AI for healthcare