🤖 AI Summary
This study addresses the problem of emotional expression loss in sign language translation, investigating how affect is conveyed through the coordinated interplay of manual features (handshape, movement) and non-manual features (facial expressions, head and upper-body posture). Drawing on in-depth qualitative interviews with eight signers from diverse cultural and auditory backgrounds, analyzed through cross-cultural comparison and human-centered design principles, the work identifies universal patterns of emotional expression in sign language alongside culturally and individually specific variations. It isolates key non-manual cues essential for emotion recognition and pinpoints fundamental bottlenecks in the affective modeling capabilities of current sign language translation systems. The study proposes foundational design principles for emotion-aware sign language technologies. By bridging a significant gap in affective computing for sign languages, it provides both theoretical grounding and practical guidance for developing embodied, emotionally adaptive sign language interfaces.
📝 Abstract
Significant advances have been made in our ability to understand and generate emotionally expressive content such as text and speech, yet comparable progress in sign language technologies remains limited. While computational approaches to sign language translation have focused on capturing lexical content, the emotional dimensions of sign language communication remain largely unexplored. Through semi-structured interviews with eight sign language users across Singapore, Sri Lanka, and the United States, including both Deaf and hard of hearing (DHH) and hearing signers, we investigate how emotions are expressed and perceived in sign languages. Our findings highlight the role of both manual and non-manual elements in emotional expression, revealing universal patterns as well as individual and cultural variations in how signers communicate emotions. We identify key challenges in capturing emotional nuance for sign language translation, and propose design considerations for developing more emotionally aware sign language technologies. This work contributes both to the theoretical understanding of emotional expression in sign language and to the practical development of interfaces that better serve diverse signing communities.