🤖 AI Summary
Novice practitioners often struggle to sustain attention during mindfulness meditation. To address this, we propose an EEG-driven virtual reality (VR) mindfulness training system that dynamically modulates fractal visual stimuli—generated via self-similar algorithms—and interactive auditory feedback—mapped in real time to neural activity—based on decoded electroencephalographic signals. This constitutes the first integration of closed-loop neurofeedback with a fractal-art-driven VR paradigm for adaptive attention regulation. Preliminary user studies demonstrate statistically significant improvements in both focus and relaxation (p < 0.05), validating the system’s efficacy and translational potential for mindfulness support. Our core contribution is a novel, scalable neuroaesthetic interaction framework that unifies real-time EEG decoding, generative fractal visualization, and adaptive audio feedback—establishing a foundation for personalized, biologically grounded immersive training systems.
📝 Abstract
Mindfulness has been studied and practiced in enhancing psychological well-being while reducing neuroticism and psychopathological indicators. However, practicing mindfulness with continuous attention is challenging, especially for beginners. In the proposed system, FractalBrain, we utilize an interactive audiovisual fractal with a geometric repetitive pattern that has been demonstrated to induce meditative effects. FractalBrain presents an experience combining a surreal virtual reality (VR) program with an electroencephalogram (EEG) interface. While viewing an ever-changing fractal-inspired artwork in an immersive environment, the user's EEG stream is analyzed and mapped into VR. These EEG data adaptively manipulates the audiovisual parameters in real-time, generating a distinct experience for each user. The pilot feedback suggests the potential of the FractalBrain to facilitate mindfulness and enhance attention.