Uncertainty-Aware Multi-Expert Knowledge Distillation for Imbalanced Disease Grading

📅 2025-05-01
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address model bias arising from domain shift and class imbalance in clinical image grading, this paper proposes the Uncertainty-aware Multi-expert Knowledge Distillation (UMKD) framework. Methodologically, UMKD introduces an Uncertainty-aware Decoupled Distillation (UDD) mechanism—the first to jointly decouple task-agnostic and task-specific features—and thereby concurrently mitigates expert heterogeneity, source-target distribution discrepancy, and class imbalance. The framework combines multi-expert ensemble distillation, shallow compact feature alignment, uncertainty-weighted output distillation, and decoupled representation learning. Evaluated on SICAPv2 (prostate histopathology grading) and APTOS (fundus image grading) under source- and target-imbalanced settings, UMKD achieves new state-of-the-art performance. It significantly improves robustness, generalization, and reliability for clinical deployment, demonstrating strong adaptability to real-world distribution shifts and label scarcity.

📝 Abstract
Automatic disease image grading is a significant application of artificial intelligence in healthcare, enabling faster and more accurate patient assessments. However, domain shifts, exacerbated by data imbalance, introduce bias into the model and pose deployment difficulties in clinical applications. To address this problem, we propose a novel Uncertainty-aware Multi-expert Knowledge Distillation (UMKD) framework that transfers knowledge from multiple expert models to a single student model. Specifically, to extract discriminative features, UMKD decouples task-agnostic and task-specific features with shallow and compact feature alignment in the feature space. In the output space, an uncertainty-aware decoupled distillation (UDD) mechanism dynamically adjusts knowledge transfer weights based on expert model uncertainties, ensuring robust and reliable distillation. Additionally, UMKD tackles model architecture heterogeneity and distribution discrepancies between source and target domains, which previous KD approaches address inadequately. Extensive experiments on histology prostate grading (SICAPv2) and fundus image grading (APTOS) demonstrate that UMKD achieves a new state of the art in both source-imbalanced and target-imbalanced scenarios, offering a robust and practical solution for real-world disease image grading.
Problem

Research questions and friction points this paper is trying to address.

Addresses data imbalance in disease image grading
Proposes uncertainty-aware multi-expert knowledge distillation
Handles domain shifts and model heterogeneity
Innovation

Methods, ideas, or system contributions that make the work stand out.

Uncertainty-aware multi-expert knowledge distillation framework
Decouples task-agnostic and task-specific features
Dynamically adjusts knowledge transfer weights
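The core idea behind the dynamic weighting above—down-weighting uncertain experts during output distillation—can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes predictive entropy as the uncertainty proxy and inverse-entropy normalization as the weighting rule (the `softmax`, `entropy`, and `uncertainty_weighted_kd` helpers are hypothetical names introduced here).

```python
import math

def softmax(logits, T=1.0):
    # Temperature-scaled softmax over a list of logits.
    scaled = [z / T for z in logits]
    m = max(scaled)
    exps = [math.exp(z - m) for z in scaled]
    s = sum(exps)
    return [e / s for e in exps]

def entropy(probs):
    # Predictive entropy: one common proxy for model uncertainty.
    return -sum(p * math.log(p) for p in probs if p > 0)

def uncertainty_weighted_kd(student_logits, expert_logits_list, T=2.0):
    """Sketch of uncertainty-weighted multi-expert distillation:
    each expert's KL(expert || student) term is weighted by the
    inverse of its predictive entropy, so confident experts
    contribute more. The paper's exact scheme may differ."""
    raw = [1.0 / (entropy(softmax(l, T)) + 1e-8) for l in expert_logits_list]
    z = sum(raw)
    weights = [w / z for w in raw]          # normalize across experts
    q = softmax(student_logits, T)          # student distribution
    loss = 0.0
    for w, logits in zip(weights, expert_logits_list):
        p = softmax(logits, T)
        kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
        loss += w * kl
    return weights, loss * T * T            # usual T^2 scaling in KD
```

For example, an expert emitting sharply peaked logits receives a larger normalized weight than one emitting near-uniform logits, so its soft targets dominate the distillation loss.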
Shuo Tong
School of Public Health, Zhejiang University
Shangde Gao
College of Computer Science and Technology, Zhejiang University
Ke Liu
Liangzhu Laboratory and WeDoctor Cloud
Zihang Huang
PhD student, Huazhong University of Science and Technology
medical image analysis · deep learning · digital image forensics
Hongxia Xu
Zhejiang University
AI4Science · Nanomedicine · Medical imaging
Haochao Ying
State Key Laboratory of Transvascular Implantation Devices, The Second Affiliated Hospital Zhejiang University School of Medicine
Jian Wu
Zhejiang Key Laboratory of Medical Imaging Artificial Intelligence