🤖 AI Summary
This study addresses the challenge of limited model generalization in cross-subject EEG classification caused by inter-subject and inter-session heterogeneity. It presents the first analysis of paradigm applicability in cross-subject EEG classification through the lens of reducibility cost, and introduces a mutual-guidance expert collaboration framework. Within this framework, shared experts learn reducible domain-invariant features, while routed experts model irreducible domain-specific characteristics. A mutual-guidance regularization mechanism jointly optimizes the two, preventing both over-reduction and under-reduction. Built upon a Mixture-of-Experts (MoE) architecture, the proposed method outperforms state-of-the-art approaches across seven benchmark datasets, and experiments on synthetic data support its theoretical analysis.
📝 Abstract
Decoding human brain activity from electroencephalography (EEG) signals holds promise for understanding neurological processes. However, EEG data exhibit heterogeneity across subjects and sessions, limiting the generalization of existing methods. Representation learning approaches sacrifice subject-specific information for domain invariance, while ensemble learning methods risk error accumulation on unseen subjects. From a theoretical perspective, we reveal that the applicability of these paradigms depends on the reducibility cost of domain-specific functions to domain-invariant ones. Building on this insight, we propose a Mutual-Guided Expert Collaboration (MGEC) framework that employs distinct network structures aligned with domain-specific and domain-invariant functions. Shared expert-guided learning captures reducible domain-invariant functions. Routed expert-guided learning employs a mixture-of-experts architecture to model irreducible domain-specific functions. Mutual-guided learning enables collaborative regularization to prevent over-reduction and under-reduction. We validate our theoretical findings on synthetic datasets, and experiments on seven benchmarks demonstrate that MGEC outperforms state-of-the-art methods.
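To make the shared-plus-routed expert structure concrete, below is a minimal pure-Python sketch of the generic MoE pattern the abstract describes: a shared expert that is always active (the domain-invariant path) combined with a top-k gated mixture of routed experts (the domain-specific paths). All function and variable names here are illustrative assumptions, not the paper's actual implementation, and the mutual-guidance regularization term is omitted.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of gate scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def linear(W, b, x):
    """Dense layer W @ x + b, with W given as a list of rows."""
    return [sum(w * xi for w, xi in zip(row, x)) + bj
            for row, bj in zip(W, b)]

def mgec_forward(x, shared_W, shared_b, experts, gate_W, gate_b, top_k=2):
    """Combine a shared (always-active) expert with the top-k routed
    experts selected by a softmax gate; returns the summed output."""
    # Shared expert: always active, stands in for the domain-invariant path.
    shared_out = linear(shared_W, shared_b, x)
    # Router: softmax over gate scores, keep only the top-k experts.
    probs = softmax(linear(gate_W, gate_b, x))
    top = sorted(range(len(probs)), key=lambda i: -probs[i])[:top_k]
    norm = sum(probs[i] for i in top)
    # Routed experts: renormalized weighted sum of the selected outputs,
    # standing in for the domain-specific paths.
    routed = [0.0] * len(shared_out)
    for i in top:
        W_i, b_i = experts[i]
        out_i = linear(W_i, b_i, x)
        routed = [r + (probs[i] / norm) * o for r, o in zip(routed, out_i)]
    return [s + r for s, r in zip(shared_out, routed)]
```

In this toy form the shared path contributes to every input, while the router concentrates capacity on a few experts per input, which is the structural split the framework assigns to reducible versus irreducible functions.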