🤖 AI Summary
Addressing practical challenges in EEG-based affective computing, including substantial inter-subject variability, severe label noise, and continuous streams of unlabeled data, this paper proposes SSOCL, a bi-level self-supervised continual learning framework. Methodologically, SSOCL couples dynamic memory-buffer optimization with pseudo-label co-refinement, jointly improving cross-subject generalization and knowledge retention under unsupervised streaming conditions. It integrates self-supervised representation learning, cluster-based label mapping, and a fast adaptation module into an iterative optimization architecture. Evaluated on two benchmark EEG datasets, DEAP and SEED, SSOCL consistently outperforms state-of-the-art continual learning and domain adaptation methods. Crucially, it maintains robustness and strong cross-subject generalization accuracy throughout the continual learning process, demonstrating adaptability to real-world deployment constraints.
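To make the "buffer optimization + pseudo-label co-refinement" idea concrete, here is a minimal, hypothetical sketch of one bi-level refinement round. It is not the authors' implementation: the centroid-based pseudo-labeling, the per-class nearest-to-centroid buffer rule, and all function names are illustrative assumptions standing in for the paper's cluster-mapping and dynamic-buffer components.

```python
import numpy as np

def assign_pseudo_labels(features, centroids):
    """Cluster-mapping proxy: label each sample by its nearest centroid."""
    dists = np.linalg.norm(features[:, None, :] - centroids[None, :, :], axis=-1)
    return dists.argmin(axis=1)

def refine_buffer(features, labels, centroids, per_class):
    """Dynamic-buffer proxy: retain the samples closest to their centroid."""
    keep = []
    for c in range(len(centroids)):
        idx = np.where(labels == c)[0]
        if idx.size == 0:
            continue
        d = np.linalg.norm(features[idx] - centroids[c], axis=1)
        keep.extend(idx[np.argsort(d)[:per_class]])
    return np.array(sorted(keep))

def bilevel_step(features, centroids, per_class=8, n_iters=3):
    """Alternate pseudo-labeling and buffer refinement, re-estimating
    centroids from the retained samples each iteration."""
    for _ in range(n_iters):
        labels = assign_pseudo_labels(features, centroids)
        keep = refine_buffer(features, labels, centroids, per_class)
        for c in range(len(centroids)):
            sel = keep[labels[keep] == c]
            if sel.size:  # update centroid from the refined buffer
                centroids[c] = features[sel].mean(axis=0)
    return labels, keep, centroids
```

In the actual framework the inner step would adapt a neural encoder rather than raw centroids; the alternation structure (label, refine, re-estimate) is the point being illustrated.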
📝 Abstract
Emotion recognition through physiological signals such as electroencephalogram (EEG) has become an essential aspect of affective computing and provides an objective way to capture human emotions. However, physiological data characterized by cross-subject variability and noisy labels hinders the performance of emotion recognition models. Existing domain adaptation and continual learning methods struggle to address these issues, especially under realistic conditions where data is continuously streamed and unlabeled. To overcome these limitations, we propose a novel bi-level self-supervised continual learning framework, SSOCL, based on a dynamic memory buffer. This bi-level architecture iteratively refines the dynamic buffer and pseudo-label assignments to effectively retain representative samples, enabling generalization from continuous, unlabeled physiological data streams for emotion recognition. The assigned pseudo-labels are subsequently leveraged for accurate emotion prediction. Key components of the framework, including a fast adaptation module and a cluster-mapping module, enable robust learning and effective handling of evolving data streams. Experimental validation on two mainstream EEG tasks demonstrates the framework's ability to adapt to continuous data streams while maintaining strong generalization across subjects, outperforming existing approaches.
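The abstract does not specify the self-supervised objective used for representation learning. As a hedged illustration of the kind of objective such a framework could employ on unlabeled EEG segments, the sketch below implements the standard NT-Xent (InfoNCE) contrastive loss over two augmented views of a batch; the choice of this particular loss is an assumption, not a claim about the paper.

```python
import numpy as np

def nt_xent_loss(z1, z2, temperature=0.5):
    """NT-Xent contrastive loss: each embedding in z1 is pulled toward its
    augmented counterpart in z2 and pushed from all other embeddings."""
    z = np.vstack([z1, z2])
    z = z / np.linalg.norm(z, axis=1, keepdims=True)   # cosine similarity
    sim = z @ z.T / temperature
    n = len(z1)
    np.fill_diagonal(sim, -np.inf)                     # exclude self-pairs
    # Index of each embedding's positive partner (the other view).
    pos = np.concatenate([np.arange(n, 2 * n), np.arange(n)])
    logsumexp = np.log(np.exp(sim).sum(axis=1))
    return float(np.mean(logsumexp - sim[np.arange(2 * n), pos]))
```

The loss is low when the two views of each sample embed close together relative to other samples, which is the property a streaming, label-free representation learner relies on.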