🤖 AI Summary
This studio addresses two key challenges in art therapy: the abstraction of emotional expression and insufficient interoceptive awareness. To tackle these, we propose an embodied art intervention integrating mixed reality (MR) with multimodal biosensing that captures respiration, heart rate variability (HRV), and eye-tracking data. Physiological signals are mapped in real time onto interactive 3D "emotional artifacts," scaffolded by low-cognitive-load analog practices such as clay modeling and hand drawing, forming a trauma-informed, body-centered workflow. The studio pursues three conceptual contributions: (1) the "3D emotional artifact" framework; (2) a design vocabulary for biosignal expressivity; and (3) a set of generative constraints for affective archiving. Through comparative making and reflection, participants examine how the MR system can strengthen interoceptive literacy. The expected outcomes are reusable design principles, signal-to-form mapping prototypes, and boundary guidelines for automated translation, establishing both a theoretical grounding and a practical paradigm for embodied digital art therapy.
📝 Abstract
This in-person studio explores how mixed reality (MR) and biometrics can make intangible emotional states tangible through embodied art practices. We begin with two well-established modalities, clay sculpting and free-form 2D drawing, to ground participants in somatic awareness and manual, reflective expression. Building on this baseline, we introduce an MR prototype that maps physiological signals (e.g., breath, heart rate variability, eye-movement dynamics) to visual and spatial parameters (color saturation, pulsing, motion qualities), generating "3D emotional artifacts." The full-day program balances theory (somatic psychology, embodied cognition, expressive biosignals), hands-on making, and comparative reflection to interrogate what analog and digital modalities respectively afford for awareness, expression, and meaning-making. Participants will (1) experience and compare analog and MR-based emotional journaling; (2) prototype and critique mappings from biosignals to visual/spatial feedback; and (3) articulate design principles for trauma-informed, hybrid workflows that amplify interoceptive literacy without overwhelming the user. The expected contributions include a shared design vocabulary for biometric expressivity, a set of generative constraints for future TEI work on emotional archiving, and actionable insights into when automated translation supports or hinders embodied connection.
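To make the signal-to-form mapping concrete, a minimal sketch of one possible mapping is shown below. This is an illustration only, not the studio's actual prototype: the signal ranges (RMSSD in milliseconds, breath rate in breaths per minute, a normalized gaze-dispersion score) and the function and field names are assumptions chosen for clarity.

```python
from dataclasses import dataclass


@dataclass
class VisualParams:
    """Visual/spatial parameters driving a 3D emotional artifact."""
    saturation: float    # color saturation, 0..1
    pulse_hz: float      # pulsing frequency of the artifact, in Hz
    motion_speed: float  # motion-quality scale, 0..1


def clamp01(x: float) -> float:
    return max(0.0, min(1.0, x))


def map_signals(rmssd_ms: float, breath_rate_bpm: float,
                gaze_dispersion: float) -> VisualParams:
    """Map biosignals to visual parameters (hypothetical linear mapping).

    rmssd_ms: HRV as RMSSD; a typical resting range of ~20-100 ms is assumed.
    breath_rate_bpm: respiration rate in breaths per minute.
    gaze_dispersion: normalized 0..1 measure of eye-movement spread.
    """
    # Higher HRV -> richer color saturation (linear over the assumed range).
    saturation = clamp01((rmssd_ms - 20.0) / 80.0)
    # The artifact pulses at the breathing rate (breaths/min -> Hz).
    pulse_hz = breath_rate_bpm / 60.0
    # Calmer, more focused gaze -> slower, smoother motion.
    motion_speed = 0.2 + 0.8 * clamp01(gaze_dispersion)
    return VisualParams(saturation, pulse_hz, motion_speed)
```

For example, a slow breath of 6 breaths per minute yields a gentle 0.1 Hz pulse, so the artifact visibly "breathes" with the participant. Keeping each mapping a small, inspectable function is one way to support the comparative critique activity, since participants can read and argue about each rule directly.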