🤖 AI Summary
This study investigates the temporal dynamics of emotion-specific facial mimicry and its modulation by personality traits. Using high-temporal-resolution facial Action Unit (AU) coding, we quantified participants’ automatic mimicry trajectories in response to fear, anger, and other emotional stimuli via Dynamic Time Warping (DTW). The Big Five Inventory assessed extraversion and agreeableness as potential moderators. Results revealed that fear elicited the strongest mimicry, whereas anger elicited the weakest. Extraversion and agreeableness significantly and positively predicted overall mimicry intensity, with medium effect sizes (η² = 0.12–0.18). This work represents the first application of DTW to model the fine-grained temporal structure of emotional mimicry, uncovering differential personality-based modulation across emotion categories. By establishing computationally tractable, empirically grounded links between personality, temporal mimicry patterns, and emotion type, our findings provide actionable behavioral principles for designing empathic virtual agents with adaptive, human-like affective responsiveness.
📝 Abstract
Facial mimicry - the automatic, unconscious imitation of others' expressions - is vital for emotional understanding. This study investigates how mimicry varies across emotions, using facial Action Units (AUs) extracted from stimulus videos and from participants' recorded responses. Dynamic Time Warping (DTW) quantified the temporal alignment between participants' expressions and those in the stimuli, revealing significant variation across emotion categories. Post-hoc tests indicated greater mimicry for 'Fear' than for 'Happy', and reduced mimicry for 'Anger' compared with 'Fear'. Correlations between mimicry and the personality traits Extraversion and Agreeableness were significant, indicating subtle yet meaningful connections. These findings suggest that specific emotions evoke stronger mimicry, with personality traits playing a secondary, moderating role in emotional alignment. Notably, the results show how personality-linked mimicry mechanisms extend beyond interpersonal communication to affective-computing applications such as remote human-human interaction and human-virtual-agent scenarios. Insights from temporal facial mimicry - e.g., designing digital agents that adaptively mirror user expressions - can help developers create empathetic, personalized systems that enhance emotional resonance and user engagement.
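To make the alignment measure concrete, here is a minimal sketch of classic DTW applied to two one-dimensional AU intensity traces. The AU12 traces below are hypothetical illustrations, not data from the study, and the paper's actual preprocessing and DTW variant (windowing, normalization, multivariate handling) are not specified here; this shows only why DTW rewards a mimicked trajectory even when the response lags the stimulus.

```python
import numpy as np

def dtw_distance(x, y):
    """Classic DTW over two 1-D time series (e.g., AU intensity traces).

    Builds the full (n+1) x (m+1) cumulative-cost matrix with absolute
    difference as the local cost; lower values mean closer temporal
    alignment regardless of small lags.
    """
    n, m = len(x), len(y)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(x[i - 1] - y[j - 1])
            # Extend the cheapest of: insertion, deletion, match.
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

# Hypothetical AU12 (lip-corner puller) intensities over six frames:
stimulus = np.array([0.0, 0.2, 0.6, 1.0, 0.8, 0.3])
response = np.array([0.0, 0.0, 0.2, 0.6, 1.0, 0.8])  # same shape, lagged one frame

print(dtw_distance(stimulus, stimulus))   # identical trajectories -> 0.0
print(dtw_distance(stimulus, response))   # small, despite the lag
print(np.abs(stimulus - response).sum())  # frame-by-frame L1 cost is much larger
```

Because DTW warps the time axis, the lagged but faithful response scores far better than a rigid frame-by-frame comparison would suggest, which is exactly the property needed to quantify mimicry that follows a stimulus with a short delay.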