Advancing Brainwave Modeling with a Codebook-Based Foundation Model

📅 2025-05-22
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Existing EEG pre-trained models inadequately model neural oscillatory features, limiting generalization and performance across BCI tasks. To address this, we propose LaBraM++, the first foundational EEG model integrating signal-processing priors with codebook-enhanced representation learning. LaBraM++ introduces band-guided learnable vector quantization, time-frequency domain self-supervised pre-training, and a lightweight adaptation head—collectively overcoming representational capacity bottlenecks and significantly enhancing oscillatory information capture. Evaluated across diverse BCI paradigms—including motor imagery, SSVEP, and ERP classification—LaBraM++ consistently outperforms state-of-the-art baselines, achieving an average accuracy improvement of 4.2% and a 32% reduction in training time. It establishes new open-source SOTA performance among EEG foundation models.

Technology Category

Application Category

📝 Abstract
Recent advances in large-scale pre-trained Electroencephalogram (EEG) models have shown great promise, driving progress in Brain-Computer Interfaces (BCIs) and healthcare applications. However, despite their success, many existing pre-trained models have struggled to fully capture the rich information content of neural oscillations, a limitation that fundamentally constrains their performance and generalizability across diverse BCI tasks. This limitation is frequently rooted in suboptimal architectural design choices which constrain their representational capacity. In this work, we introduce LaBraM++, an enhanced Large Brainwave Foundation Model (LBM) that incorporates principled improvements grounded in robust signal processing foundations. LaBraM++ demonstrates substantial gains across a variety of tasks, consistently outperforming its originally-based architecture and achieving competitive results when compared to other open-source LBMs. Its superior performance and training efficiency highlight its potential as a strong foundation for future advancements in LBMs.
Problem

Research questions and friction points this paper is trying to address.

Existing EEG models fail to capture neural oscillations fully
Suboptimal architecture limits model performance and generalizability
LaBraM++ improves signal processing for better BCI task results
Innovation

Methods, ideas, or system contributions that make the work stand out.

Codebook-based EEG model for brainwave modeling
Enhanced Large Brainwave Foundation Model (LaBraM++)
Robust signal processing for improved performance
🔎 Similar Papers
No similar papers found.