Understanding State Social Anxiety in Virtual Social Interactions using Multimodal Wearable Sensing Indicators

📅 2025-03-19
📈 Citations: 0
Influential: 0
🤖 AI Summary
This study addresses the challenge of dynamically detecting state social anxiety during virtual social interactions. We propose a minute-level multimodal modeling approach leveraging wearable physiological sensing. Using the Empatica E4, we continuously collected PPG, EDA, skin temperature, and acceleration signals during Zoom-based dyadic and small-group interactions, synchronized with fine-grained, three-phase (anticipatory, experiential, reflective) anxiety annotations at 2–6 minute resolution—the first such effort at this temporal granularity. Innovatively integrating situational context (e.g., interaction type, speaker role) with individual trait-level mental health data, we develop a mixed-effects logistic regression model validated via leave-one-subject-out cross-validation. A physiology-only model achieves 59% accuracy; incorporating multimodal contextual features elevates accuracy to 69%–84%, substantially improving discrimination between high- and low-state-anxiety episodes. Our framework establishes a novel, interpretable, and robust paradigm for real-time psychological state monitoring in virtual environments.

📝 Abstract
Mobile sensing is ubiquitous and offers opportunities to gain insight into state mental health functioning. Detecting state elevations in social anxiety would be especially useful given this phenomenon is highly prevalent and impairing, but often not disclosed. Although anxiety is highly dynamic, fluctuating rapidly over the course of minutes, most work to date has examined anxiety at a scale of hours, days, or longer. In the present work, we explore the feasibility of detecting fluctuations in state social anxiety among N = 46 undergraduate students with elevated symptoms of trait social anxiety. Participants engaged in two dyadic and two group social interactions via Zoom. We evaluated participants' state anxiety levels as they anticipated, immediately after experiencing, and upon reflecting on each social interaction, spanning a time frame of 2–6 minutes. We collected biobehavioral features (i.e., PPG, EDA, skin temperature, and accelerometer) via Empatica E4 devices as they participated in the varied social contexts (e.g., dyadic vs. group; anticipating vs. experiencing the interaction; experiencing varying levels of social evaluation). We additionally measured their trait mental health functioning. Mixed-effect logistic regression and leave-one-subject-out machine learning modeling indicated biobehavioral features significantly predict state fluctuations in anxiety, though balanced accuracy tended to be modest (59%). However, our capacity to identify instances of heightened versus low state anxiety significantly increased (with balanced accuracy ranging from 69% to 84% across different operationalizations of state anxiety) when we integrated contextual data alongside trait mental health functioning into our predictive models. We discuss these and other findings in the context of the broader anxiety detection literature.
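The evaluation described above pairs logistic-regression modeling with leave-one-subject-out cross-validation scored by balanced accuracy. A minimal sketch of that LOSO protocol, using synthetic stand-in data and a plain `scikit-learn` logistic classifier rather than the authors' mixed-effects model (all feature values, labels, and dimensions below are illustrative, not from the study):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import balanced_accuracy_score
from sklearn.model_selection import LeaveOneGroupOut

rng = np.random.default_rng(0)

# Synthetic stand-ins: 46 participants, 12 labeled episodes each,
# 4 biobehavioral features (e.g., PPG-, EDA-, temperature-, ACC-derived).
n_subjects, n_per_subject, n_features = 46, 12, 4
X = rng.normal(size=(n_subjects * n_per_subject, n_features))
groups = np.repeat(np.arange(n_subjects), n_per_subject)  # subject IDs
# Binary high/low state-anxiety labels, weakly tied to feature 0.
y = (X[:, 0] + rng.normal(scale=2.0, size=len(X)) > 0).astype(int)

# Leave-one-subject-out: each fold holds out every episode of one
# participant, so the model is always scored on an unseen person.
logo = LeaveOneGroupOut()
y_true, y_pred = [], []
for train_idx, test_idx in logo.split(X, y, groups):
    clf = LogisticRegression().fit(X[train_idx], y[train_idx])
    y_true.extend(y[test_idx])
    y_pred.extend(clf.predict(X[test_idx]))

# Balanced accuracy averages per-class recall, so a majority-class
# guesser scores 0.5 even when labels are imbalanced.
print(f"balanced accuracy: {balanced_accuracy_score(y_true, y_pred):.2f}")
```

The subject-wise split is what makes the 59%–84% figures meaningful: a random episode-level split would leak each participant's idiosyncratic physiology between train and test folds.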
Problem

Research questions and friction points this paper is trying to address.

Detecting rapid fluctuations in state social anxiety using wearable sensors.
Exploring biobehavioral features to predict anxiety during virtual social interactions.
Improving anxiety detection accuracy by integrating contextual and trait mental health data.
Innovation

Methods, ideas, or system contributions that make the work stand out.

Multimodal wearable sensing for anxiety detection
Real-time biobehavioral feature analysis
Contextual data integration in predictive models
Maria A. Larrazabal
Department of Psychology, University of Virginia, USA
Zhiyuan Wang
Department of Systems and Information Engineering, University of Virginia, USA
Mark Rucker
PhD Candidate, University of Virginia, USA
Emma R. Toner
Department of Psychology, University of Virginia, USA
M. Boukhechba
Johnson & Johnson Innovative Medicine, USA
B. Teachman
Department of Psychology, University of Virginia, USA
Laura E. Barnes
Department of Systems and Information Engineering, University of Virginia, USA