🤖 AI Summary
To address model bias arising from domain shift and class imbalance in clinical image grading, this paper proposes the Uncertainty-aware Multi-expert Knowledge Distillation (UMKD) framework. Methodologically, UMKD decouples task-agnostic and task-specific features through shallow and compact feature alignment in the feature space, and introduces an Uncertainty-aware Decoupled Distillation (UDD) mechanism that weights knowledge transfer in the output space by each expert's uncertainty, thereby concurrently mitigating expert architecture heterogeneity, source-target distribution discrepancy, and class imbalance. Evaluated on SICAPv2 (prostate histopathology grading) and APTOS (fundus image grading) under both source-imbalanced and target-imbalanced settings, UMKD achieves new state-of-the-art performance, enhancing robustness, generalization, and reliability for clinical deployment under real-world distribution shift.
📝 Abstract
Automatic disease image grading is a significant application of artificial intelligence for healthcare, enabling faster and more accurate patient assessments. However, domain shifts, which are exacerbated by data imbalance, introduce bias into the model, posing deployment difficulties in clinical applications. To address this problem, we propose a novel **U**ncertainty-aware **M**ulti-experts **K**nowledge **D**istillation (UMKD) framework to transfer knowledge from multiple expert models to a single student model. Specifically, to extract discriminative features, UMKD decouples task-agnostic and task-specific features with shallow and compact feature alignment in the feature space. In the output space, an uncertainty-aware decoupled distillation (UDD) mechanism dynamically adjusts knowledge transfer weights based on expert model uncertainties, ensuring robust and reliable distillation. UMKD also tackles model architecture heterogeneity and distribution discrepancies between source and target domains, which are inadequately addressed by previous KD approaches. Extensive experiments on histology prostate grading (*SICAPv2*) and fundus image grading (*APTOS*) demonstrate that UMKD achieves a new state-of-the-art in both source-imbalanced and target-imbalanced scenarios, offering a robust and practical solution for real-world disease image grading.
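The abstract does not give the exact form of the uncertainty weighting, so the following is only a minimal illustrative sketch of the general idea behind uncertainty-weighted output distillation: each expert's predictive entropy is converted into a weight (confident experts contribute more), and the student is distilled against a weighted sum of per-expert KL terms. The entropy-to-weight mapping and the helper names (`uncertainty_weights`, `distill_loss`) are assumptions, not the paper's actual UDD formulation.

```python
import math

def softmax(logits, T=1.0):
    """Temperature-scaled softmax over a list of logits."""
    m = max(logits)
    exps = [math.exp((z - m) / T) for z in logits]
    s = sum(exps)
    return [e / s for e in exps]

def entropy(probs):
    """Shannon entropy of a probability vector (nats)."""
    return -sum(p * math.log(p) for p in probs if p > 0)

def uncertainty_weights(expert_logits, T=1.0):
    """Hypothetical weighting: lower predictive entropy -> higher weight.
    Weights are exp(-entropy), normalized to sum to 1."""
    ents = [entropy(softmax(z, T)) for z in expert_logits]
    inv = [math.exp(-e) for e in ents]
    s = sum(inv)
    return [w / s for w in inv]

def distill_loss(student_logits, expert_logits, T=2.0):
    """Uncertainty-weighted sum of KL(expert || student) terms --
    a sketch of the general mechanism, not the paper's exact loss."""
    ws = untrusted = uncertainty_weights(expert_logits, T)
    ps = softmax(student_logits, T)
    loss = 0.0
    for w, logits in zip(ws, expert_logits):
        pt = softmax(logits, T)
        loss += w * sum(p * math.log(p / q) for p, q in zip(pt, ps) if p > 0)
    return loss
```

For example, a sharply peaked expert (logits `[5, 0, 0]`) receives a larger weight than a maximally uncertain one (logits `[1, 1, 1]`), so unreliable experts are down-weighted rather than discarded.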