🤖 AI Summary
This study investigates how age modulates perception of affective bodily expressions in the humanoid robot NAO, comparing children, young adults, and older adults. Using a cross-sectional behavioral experiment, we systematically assessed recognition accuracy and response patterns for four basic emotions (happiness, anger, sadness, fear). Results reveal a striking similarity in emotion recognition performance between children and older adults—both significantly underperforming relative to young adults—thereby identifying an empirically grounded “bipolar age similarity” effect. Critically, NAO demonstrated robust cross-generational expressivity, particularly for emotions with high intersubjective consensus. This work provides the first empirical evidence of age-related convergence at developmental extremes in socio-affective robot perception. It advances foundational knowledge for designing emotionally intelligent social robots tailored to aging populations, offering empirically validated cognitive principles and concrete pathways for age-inclusive interaction design.
📝 Abstract
This paper presents an empirical study investigating how individuals across different age groups, children, young and older adults, interpret emotional body language expressed by the humanoid robot NAO. The aim is to offer insights into how users perceive and respond to emotional cues from robotic agents, through an empirical evaluation of the robot's effectiveness in conveying emotions to different groups of users. By analyzing data collected from elderly participants and comparing these findings with previously gathered data from young adults and children, the study highlights similarities and differences between the groups, with younger and older users more similar but different from young adults.