🤖 AI Summary
This study addresses low student engagement and delayed intervention in online learning. We systematically review state-of-the-art approaches that integrate biosensors (heart rate, EEG, and eye-tracking) with multimodal learning analytics (interaction logs, behavioral video, and textual data) for real-time detection and prediction of learning states. Synthesizing evidence from 54 empirical studies, we propose a "physiological-behavioral" dual-source collaborative modeling framework, together with a unified multimodal preprocessing paradigm and interpretable machine learning strategies. Results show that the proposed fusion method improves average classification accuracy for attention, cognitive load, and affective states by 12.7% over unimodal baselines. The framework provides both a theoretical foundation and a technical pathway for building personalized, adaptive intelligent tutoring systems capable of real-time intervention.
📝 Abstract
In modern online learning, understanding and predicting student behavior is crucial for enhancing engagement and optimizing educational outcomes. This systematic review explores the integration of biosensors and Multimodal Learning Analytics (MmLA) to analyze and predict student behavior during computer-based learning sessions. We examine key challenges, including emotion and attention detection, behavioral analysis, experimental design, and demographic considerations in data collection. Our study highlights the growing role of physiological signals, such as heart rate, brain activity, and eye movements, combined with traditional interaction data and self-reports, in gaining deeper insight into cognitive states and engagement levels. We synthesize findings from 54 key studies, analyzing commonly used methodologies such as advanced machine learning algorithms and multimodal data pre-processing techniques. The review identifies current research trends, limitations, and emerging directions in the field, emphasizing the transformative potential of biosensor-driven adaptive learning systems. Our findings suggest that integrating multimodal data enables personalized learning experiences, real-time feedback, and intelligent educational interventions, ultimately advancing toward a more adaptive online learning experience.
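The multimodal fusion idea behind the review can be illustrated with a minimal early-fusion sketch: each modality is standardized independently and the resulting features are concatenated into one vector before classification. The feature names and values below are hypothetical examples, not data from the reviewed studies.

```python
# Minimal early-fusion sketch for combining physiological and
# behavioral features. All names and values are illustrative only.

def zscore(xs):
    """Standardize a list of numbers to zero mean, unit variance."""
    n = len(xs)
    mean = sum(xs) / n
    std = (sum((x - mean) ** 2 for x in xs) / n) ** 0.5 or 1.0
    return [(x - mean) / std for x in xs]

def early_fusion(modalities):
    """Normalize each modality on its own scale, then concatenate."""
    fused = []
    for features in modalities:
        fused.extend(zscore(features))
    return fused

# Hypothetical per-session features from two sources:
physiological = [72.0, 80.0, 65.0]  # e.g. heart-rate-derived features
behavioral = [0.4, 0.9]             # e.g. interaction-log features

vec = early_fusion([physiological, behavioral])
print(len(vec))  # 5: one fused feature vector for a downstream classifier
```

Per-modality standardization before concatenation keeps one modality's scale (e.g. beats per minute vs. click rates) from dominating the fused representation, which is one common way unimodal baselines are extended to multimodal models.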