Augmented Body Communicator: Enhancing daily body expression for people with upper limb limitations through LLM and a robotic arm

๐Ÿ“… 2025-05-09
๐Ÿ“ˆ Citations: 0
โœจ Influential: 0
๐Ÿค– AI Summary
This study addresses diminished social expressivity among individuals with upper-limb motor impairments. We propose an augmented bodily expression system integrating large language models (LLMs) with a modular robotic arm. The system enables collaborative gesture co-design between users and caregivers, leverages multimodal contextual awareness and real-time social situational understanding to generate adaptive limb expressions, and introduces the novel โ€œKinetic Memoryโ€ framework for personalized, iterative gesture co-creation. As the first work to synergize LLM-driven semantic interpretation with robot-mediated limb motion for socio-affective (non-functional) expression, it demonstrates significant improvements in a user study with six participants: average communication confidence increased by 42%, gesture suggestion adoption rate reached 78%, and empirical validation confirmed both the feasibility and user acceptability of robotic arms for non-functional social expression.

๐Ÿ“ Abstract
Individuals with upper limb movement limitations face challenges in interacting with others. Although robotic arms are currently used primarily for functional tasks, there is considerable potential to explore ways to enhance users' body language capabilities during social interactions. This paper introduces an Augmented Body Communicator system that integrates robotic arms and a large language model. Through the incorporation of kinetic memory, disabled users and their supporters can collaboratively design actions for the robot arm. The LLM system then provides suggestions on the most suitable action based on contextual cues during interactions. The system underwent thorough user testing with six participants who have conditions affecting upper limb mobility. Results indicate that the system improves users' ability to express themselves. Based on our findings, we offer recommendations for developing robotic arms that support disabled individuals with body language capabilities and functional tasks.
Problem

Research questions and friction points this paper is trying to address.

Enhancing body expression for individuals with upper-limb disabilities
Integrating robotic arms and LLMs into social interaction
Improving self-expression through collaborative gesture design
Innovation

Methods, ideas, or system contributions that make the work stand out.

Integrates a robotic arm with a large language model
Uses kinetic memory for collaborative gesture design between users and supporters
LLM suggests the most suitable gesture based on conversational context
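The suggestion mechanism described above — an LLM choosing from a library of co-designed gestures based on conversational context — can be sketched roughly as below. This is an illustrative reconstruction, not the paper's actual implementation; all names (`KINETIC_MEMORY`, `build_prompt`, `select_gesture`) and the gesture entries are hypothetical.

```python
# Hypothetical sketch of the gesture-suggestion loop: a "kinetic memory"
# of gestures co-designed by the user and their supporters, plus helpers
# that (1) ask an LLM to pick a gesture for the current context and
# (2) map the LLM's free-text reply back onto a known gesture name.

# Illustrative gesture library; real entries would be co-created motions.
KINETIC_MEMORY = {
    "wave_hello": "Raise the arm and wave side to side in greeting.",
    "thumbs_up": "Rotate the wrist and extend a thumbs-up of agreement.",
    "gentle_decline": "Sweep the arm slowly sideways to politely decline.",
}


def build_prompt(context: str, memory: dict) -> str:
    """Compose an LLM prompt listing stored gestures and the social context."""
    options = "\n".join(f"- {name}: {desc}" for name, desc in memory.items())
    return (
        "You assist a user with limited upper-limb mobility.\n"
        f"Conversation context: {context}\n"
        "Choose the single most suitable gesture from:\n"
        f"{options}\n"
        "Reply with the gesture name only."
    )


def select_gesture(llm_reply: str, memory: dict):
    """Match the LLM's reply to a stored gesture; None means no motion."""
    reply = llm_reply.strip().lower()
    for name in memory:
        if name in reply:
            return name
    return None  # fall back to no motion if the reply is unrecognised
```

In a full system, the prompt would be sent to an LLM API and the selected gesture name dispatched to the robotic arm's motion controller; keeping the LLM's role to *selecting* from pre-approved gestures (rather than generating raw motion) preserves user control over how their body is represented.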
Songchen Zhou
Keio University Graduate School of Media Design, Japan
Mark Armstrong
Keio University Graduate School of Media Design, Japan
Giulia Barbareschi
Professor, Research Centre Trustworthy Data Science and Security, University of Duisburg-Essen
Disability · Assistive Technology · Engineering Education
Toshihiro Ajioka
Keio University Graduate School of Media Design, Japan
Zheng Hu
Keio University Graduate School of Media Design, Japan
Ryoichi Ando
Keio University Graduate School of Media Design, Japan
Kentaro Yoshifuji
Ory Lab Inc., Japan
Masatane Muto
WITH ALS, Japan
Kouta Minamizawa
Professor, Keio University Graduate School of Media Design
Haptics · Embodied Media · Virtual Reality · Human Augmentation · Human-Computer Interaction