Improving Inclusivity for Emotion Recognition Based on Face Tracking

📅 2025-05-07
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Virtual avatars in MR/VR suffer from limited emotional expressiveness, and existing facial-tracking-based emotion recognition methods exhibit poor generalizability due to substantial inter-individual variability in facial expressions. Method: This paper introduces the first multi-strategy personalized calibration framework for inclusive emotion recognition. It integrates real-time facial Action Unit (AU) dynamic modeling, a lightweight user self-calibration mechanism, and incremental transfer learning—jointly balancing population-level generalizability and user-specific expressivity to systematically mitigate recognition bias arising from cross-user expression diversity. Contribution/Results: Evaluated on a diverse cohort spanning age, skin tone, and neurodiverse populations, the framework achieves a 23.6% improvement in emotion recognition accuracy and reduces F1-score variance by 41%, significantly enhancing model fairness, robustness, and inclusivity.

Technology Category

Application Category

📝 Abstract
The limited expressiveness of virtual user representations in Mixed Reality and Virtual Reality can inhibit an integral part of communication: emotional expression. Emotion recognition based on face tracking is often used to compensate for this. However, emotional facial expressions are highly individual, which is why many approaches have difficulties recognizing unique variations of emotional expressions. We propose several strategies to improve face tracking systems for emotion recognition with and without user intervention for the Affective Interaction Workshop at CHI '25.
Problem

Research questions and friction points this paper is trying to address.

Enhancing emotion recognition in virtual environments
Addressing individual variations in facial expressions
Improving face tracking systems for inclusivity
Innovation

Methods, ideas, or system contributions that make the work stand out.

Enhancing face tracking for emotion recognition
Addressing individual emotional expression variations
Proposing user-intervention and non-intervention strategies
🔎 Similar Papers
No similar papers found.
Mats Ole Ellenberg
Mats Ole Ellenberg
Interactive Media Lab Dresden, TUD Dresden University of Technology
K
Katja Krug
Interactive Media Lab Dresden, TU Dresden University of Technology, Germany