Blend the Separated: Mixture of Synergistic Experts for Data-Scarcity Drug-Target Interaction Prediction

📅 2025-03-20
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address the dual scarcity challenge in drug–target interaction (DTI) prediction, where intrinsic and extrinsic input data for drugs and targets can be scarce alongside sparse ground-truth labels, we propose a dual-expert collaborative semi-supervised framework. It employs a graph neural network to model molecular structures and multi-view representation learning to capture biological associations, integrating the heterogeneous features via an adaptive gating fusion mechanism. Further, we introduce bidirectional mutual supervision and semi-supervised knowledge distillation so that the two experts jointly exploit unlabeled samples. This is the first work to systematically tackle DTI prediction under simultaneous input and label scarcity, built on a decoupled dual-expert architecture with a learnable fusion mechanism. Extensive experiments on three real-world datasets demonstrate significant improvements over state-of-the-art methods, with up to a 53.53% performance gain; notably, the method also maintains superior accuracy in non-scarce settings.
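The adaptive gating fusion described above can be pictured as a per-sample convex combination of the two experts' embeddings, where a learned gate decides how much each view contributes. Below is a minimal stdlib-only sketch; the scalar gate, the weight vector `w_gate`, and the function names are illustrative assumptions for exposition, not the paper's actual parameterization (which learns the gate jointly with both experts).

```python
import math

def sigmoid(x: float) -> float:
    return 1.0 / (1.0 + math.exp(-x))

def gated_fusion(h_intrinsic, h_extrinsic, w_gate, b_gate=0.0):
    """Per-sample adaptive gating: a learned scalar gate decides how much
    each expert's embedding contributes to the fused representation."""
    # Gate input: concatenation of both expert embeddings.
    concat = h_intrinsic + h_extrinsic
    g = sigmoid(sum(w * x for w, x in zip(w_gate, concat)) + b_gate)
    # Convex combination of the two expert views; g near 1 trusts the
    # intrinsic expert, g near 0 trusts the extrinsic expert.
    return [g * a + (1.0 - g) * b for a, b in zip(h_intrinsic, h_extrinsic)]
```

Because the gate is computed from the sample's own embeddings, a drug with a rich molecular graph but few biological associations can lean on the intrinsic expert, and vice versa.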

📝 Abstract
Drug-target interaction (DTI) prediction is essential in various applications, including drug discovery and clinical practice. Two perspectives of input data are widely used in DTI prediction: intrinsic data describes how drugs or targets are constructed, and extrinsic data describes how drugs or targets relate to other biological entities. However, either perspective can be scarce for some drugs or targets, especially unpopular or newly discovered ones. Furthermore, ground-truth labels for specific interaction types can also be scarce. Therefore, we propose the first method to tackle DTI prediction under input data and/or label scarcity. To keep the model functional when only one perspective of input data is available, we design two separate experts that process intrinsic and extrinsic data respectively and fuse their outputs adaptively per sample. Furthermore, to make the two perspectives complement each other and remedy label scarcity, the two experts synergize in a mutually supervised way to exploit the enormous amount of unlabeled data. Extensive experiments on three real-world datasets under different extents of input data and/or label scarcity demonstrate that our model outperforms the state of the art significantly and steadily, with a maximum improvement of 53.53%. Our model also outperforms current methods when no data scarcity is present.
Problem

Research questions and friction points this paper is trying to address.

Predict drug-target interactions with scarce input data.
Address label scarcity in drug-target interaction prediction.
Fuse intrinsic and extrinsic data for improved prediction accuracy.
Innovation

Methods, ideas, or system contributions that make the work stand out.

Separate experts process intrinsic and extrinsic data
Experts synergize in mutually supervised learning
Adaptive fusion of data enhances prediction accuracy
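The mutually supervised synergy between the two experts can be pictured as confidence-thresholded cross pseudo-labeling on unlabeled drug-target pairs: wherever one expert is confident, its prediction becomes a training target for the other. The threshold value and the hard 0/1 labels in this sketch are illustrative assumptions; the paper's actual scheme (e.g. soft distillation targets) may differ.

```python
def mutual_pseudo_labels(probs_a, probs_b, threshold=0.9):
    """Cross-supervision on unlabeled samples: where one expert is
    confident (predicted probability near 0 or 1), its hard prediction
    becomes a training target for the other expert, and vice versa.
    Returns two lists of (sample_index, pseudo_label) pairs."""
    targets_for_b, targets_for_a = [], []
    for i, (pa, pb) in enumerate(zip(probs_a, probs_b)):
        if max(pa, 1.0 - pa) >= threshold:   # expert A is confident here
            targets_for_b.append((i, int(pa >= 0.5)))
        if max(pb, 1.0 - pb) >= threshold:   # expert B is confident here
            targets_for_a.append((i, int(pb >= 0.5)))
    return targets_for_a, targets_for_b
```

Since each expert sees a different perspective of the input, their confident regions rarely coincide, which is what lets the unlabeled pool supply complementary supervision to both.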
Xinlong Zhai
Beijing University of Posts and Telecommunications
Chuncheng Wang
Beijing University of Posts and Telecommunications
Ruijia Wang
China Telecom Cloud Computing Research Institute
Jiazheng Kang
Beijing University of Posts and Telecommunications
Shujie Li
Beijing University of Posts and Telecommunications
Boyu Chen
The University of Sydney
Neural Architecture Search, Transformer
Tengfei Ma
Stony Brook University
Natural Language Processing, Machine Learning, Healthcare, Graph Neural Networks
Zikai Zhou
Hong Kong University of Science and Technology (Guangzhou)
Data-centric AI, Diffusion Models, Autoregressive Model, AIGC
Cheng Yang
Beijing University of Posts and Telecommunications
Chuan Shi
Beijing University of Posts and Telecommunications
Data Mining, Machine Learning, Social Network Analysis