Domain-Expert-Guided Hybrid Mixture-of-Experts for Medical AI: Integrating Data-Driven Learning with Clinical Priors

📅 2026-01-25
📈 Citations: 0
Influential: 0
🤖 AI Summary
Medical AI is often hindered by data scarcity, which limits the effective training of conventional Mixture-of-Experts (MoE) models and impedes the integration of clinical expertise. To address this challenge, this work proposes the DKGH-MoE module, which, for the first time, incorporates clinical priors—such as physicians’ gaze patterns—into the MoE architecture in a plug-and-play and interpretable manner. The resulting hybrid expert system combines a data-driven network that learns general features with a prior-guided network informed by eye-tracking trajectories to focus on regions of high diagnostic value. Evaluated across multiple medical imaging tasks, the proposed method significantly improves diagnostic performance while enhancing model interpretability, thereby demonstrating the effectiveness and necessity of synergistically integrating domain knowledge with data-driven learning.

📝 Abstract
Mixture-of-Experts (MoE) models increase representational capacity at modest computational cost, but their effectiveness in specialized domains such as medicine is limited by small datasets. In contrast, clinical practice offers rich expert knowledge, such as physician gaze patterns and diagnostic heuristics, that models cannot reliably learn from limited data. Combining data-driven experts, which capture novel patterns, with domain-expert-guided experts, which encode accumulated clinical insights, provides complementary strengths for robust and clinically meaningful learning. To this end, we propose Domain-Knowledge-Guided Hybrid MoE (DKGH-MoE), a plug-and-play and interpretable module that unifies data-driven learning with domain expertise. DKGH-MoE combines a data-driven MoE that extracts novel features from raw imaging data with a domain-expert-guided MoE that incorporates clinical priors, specifically clinician eye-gaze cues, to emphasize regions of high diagnostic relevance. By integrating domain-expert insights with data-driven features, DKGH-MoE improves both performance and interpretability.
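The paper does not include code, but the hybrid design the abstract describes can be sketched in a minimal NumPy toy: one pool of experts sees raw features, a second pool sees the same features re-weighted by a gaze-derived saliency map, and a learned router gates over both pools. All names (`HybridMoE`, `data_experts`, `prior_experts`, the linear-expert form, and the per-feature `gaze` vector) are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

class HybridMoE:
    """Toy hybrid MoE (illustrative, not the paper's code):
    data-driven experts consume raw features; prior-guided experts
    consume features scaled by a clinician gaze-saliency vector."""

    def __init__(self, dim, n_data, n_prior, seed=0):
        rng = np.random.default_rng(seed)
        scale = 1.0 / np.sqrt(dim)
        self.data_experts = [rng.standard_normal((dim, dim)) * scale
                             for _ in range(n_data)]
        self.prior_experts = [rng.standard_normal((dim, dim)) * scale
                              for _ in range(n_prior)]
        # One router gates jointly over both expert pools.
        self.router = rng.standard_normal((dim, n_data + n_prior)) * scale

    def __call__(self, x, gaze):
        # gaze: per-feature saliency in [0, 1], e.g. derived from
        # eye-tracking heatmaps projected onto the feature vector.
        gates = softmax(x @ self.router)                      # (n_experts,)
        outs = [x @ W for W in self.data_experts]             # raw-feature experts
        outs += [(gaze * x) @ W for W in self.prior_experts]  # gaze-weighted experts
        return sum(g * o for g, o in zip(gates, np.stack(outs)))

moe = HybridMoE(dim=8, n_data=2, n_prior=2)
x = np.ones(8)
gaze = np.linspace(0.1, 1.0, 8)   # hypothetical saliency profile
y = moe(x, gaze)
assert y.shape == (8,)
```

The "plug-and-play" claim corresponds to the module having the same input/output shape as a standard feed-forward block, so it can replace one inside an existing backbone; interpretability comes from inspecting the gate weights assigned to the prior-guided experts.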
Problem

Research questions and friction points this paper is trying to address.

Mixture-of-Experts
medical AI
small datasets
clinical priors
domain expertise
Innovation

Methods, ideas, or system contributions that make the work stand out.

Mixture-of-Experts
clinical priors
domain knowledge
interpretable AI
eye-gaze cues
Jinchen Gu
Department of Computer Science, Indiana University Indianapolis, Indianapolis, IN, USA
Nan Zhao
Department of Computer Science, Indiana University Indianapolis, Indianapolis, IN, USA
Lei Qiu
Department of Computer Science, Indiana University Indianapolis, Indianapolis, IN, USA
Lu Zhang
Assistant Professor, Indiana University Indianapolis
AI in Neuroscience · Brain Inspired AI · Brain Disorders · LLMs