🤖 AI Summary
This study addresses diminished social expressivity among individuals with upper-limb motor impairments. We propose an augmented bodily expression system integrating large language models (LLMs) with a modular robotic arm. The system enables collaborative gesture co-design between users and caregivers, leverages multimodal contextual awareness and real-time social situational understanding to generate adaptive limb expressions, and introduces the "Kinetic Memory" framework for personalized, iterative gesture co-creation. As the first work to synergize LLM-driven semantic interpretation with robot-mediated limb motion for socio-affective (non-functional) expression, it demonstrates significant gains in a user study with six participants: average communication confidence increased by 42%, the gesture-suggestion adoption rate reached 78%, and the results confirm both the feasibility and user acceptability of robotic arms for non-functional social expression.
📄 Abstract
Individuals with upper limb movement limitations face challenges in interacting with others. Although robotic arms are currently used primarily for functional tasks, there is considerable potential to explore ways to enhance users' body language capabilities during social interactions. This paper introduces an Augmented Body Communicator system that integrates a robotic arm with a large language model. Through the incorporation of kinetic memory, disabled users and their supporters can collaboratively design actions for the robot arm. The LLM system then suggests the most suitable action based on contextual cues during interactions. The system underwent thorough user testing with six participants whose conditions affect upper limb mobility. Results indicate that the system improves users' ability to express themselves. Based on our findings, we offer recommendations for developing robotic arms that support disabled individuals with both body language expression and functional tasks.
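The paper does not publish an implementation, but the pipeline the abstract describes (a kinetic memory of co-designed gestures, queried by an LLM against conversational context) can be sketched. The following is a minimal illustration in Python under stated assumptions: all names (`Gesture`, `KineticMemory`, `suggest_gesture`) and the stub LLM are hypothetical, not the authors' code.

```python
from dataclasses import dataclass

# Hypothetical sketch of the described pipeline: a "kinetic memory" of
# gestures co-designed by the user and a supporter, plus an LLM-backed
# selector that picks a gesture to match the conversational context.

@dataclass
class Gesture:
    name: str            # label chosen during co-design
    description: str     # natural-language meaning, e.g. "emphatic agreement"
    waypoints: list      # joint-angle keyframes recorded for the robot arm

class KineticMemory:
    """Stores gestures that a user and supporter designed together."""
    def __init__(self):
        self._gestures = {}

    def add(self, gesture: Gesture) -> None:
        self._gestures[gesture.name] = gesture

    def catalog(self) -> str:
        """Render the gesture library as text for the LLM prompt."""
        return "\n".join(
            f"- {g.name}: {g.description}" for g in self._gestures.values()
        )

    def get(self, name: str) -> Gesture:
        return self._gestures[name]

def suggest_gesture(memory: KineticMemory, context: str, llm) -> Gesture:
    """Ask the LLM which stored gesture best fits the current context."""
    prompt = (
        "You assist a user with limited upper-limb mobility.\n"
        f"Available gestures:\n{memory.catalog()}\n"
        f"Conversation context: {context}\n"
        "Reply with the single best gesture name."
    )
    name = llm(prompt).strip()  # llm is any text-in/text-out callable
    return memory.get(name)

# Usage with a stub LLM (a real system would call a hosted model instead):
if __name__ == "__main__":
    memory = KineticMemory()
    memory.add(Gesture("wave", "friendly greeting", waypoints=[[0, 30], [0, 60]]))
    memory.add(Gesture("thumbs_up", "approval or agreement", waypoints=[[90, 0]]))

    stub_llm = lambda prompt: "thumbs_up"
    chosen = suggest_gesture(memory, "Friend says: 'I passed my exam!'", stub_llm)
    print(chosen.name)  # -> thumbs_up; its waypoints would drive the arm
```

In the actual system, the chosen gesture's waypoints would be streamed to the arm's motion controller, and newly recorded gestures would be added back into the memory, closing the iterative co-creation loop the summary calls Kinetic Memory.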