Examining the legibility of humanoid robot arm movements in a pointing task

📅 2025-08-07
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This study investigates the legibility of humanoid robot pointing gestures: how humans predict robot intent from truncated arm movements and multimodal bodily cues, particularly eye gaze and arm kinematics, with the goal of improving safety and interpretability in human–robot interaction (HRI). Using the NICO humanoid robot, the authors employed trajectory truncation, touchscreen-based target presentation, and a factorial experimental design manipulating gaze–pointing congruency, while recording behavioral responses and eye-tracking data. Results demonstrate that multimodal cue integration, especially congruent gaze and pointing, significantly improves intention-prediction accuracy. Critically, this work provides the first empirical validation in HRI of the "gaze-priority" hypothesis: ocular cues dominate predictive inference, and their primacy is independent of arm-motion completeness. These findings offer foundational cognitive insights for designing robot action policies with enhanced legibility and anticipatory transparency.

📝 Abstract
Human–robot interaction requires robots whose actions are legible, allowing humans to interpret, predict, and feel safe around them. This study investigates the legibility of humanoid robot arm movements in a pointing task, aiming to understand how humans predict robot intentions from truncated movements and bodily cues. We designed an experiment using the NICO humanoid robot, in which participants observed its arm movements towards targets on a touchscreen. Robot cues varied across conditions: gaze only, pointing only, and pointing with congruent or incongruent gaze. Arm trajectories were stopped at 60% or 80% of their full length, and participants predicted the final target. We tested the multimodal superiority and ocular primacy hypotheses, both of which were supported by the experiment.
Problem

Research questions and friction points this paper is trying to address.

Investigates legibility of humanoid robot arm movements
Examines human prediction of robot intentions from truncated movements
Tests multimodal superiority and ocular primacy hypotheses
Innovation

Methods, ideas, or system contributions that make the work stand out.

Humanoid robot arm movement analysis
Multimodal gaze and pointing cues
Truncated trajectory prediction testing
Andrej Lúčny
Comenius University, Bratislava
real-time cognitive architectures, computer vision, robot control and simulation
Matilde Antonj
University of Genoa, Italian Institute of Technology
Human-Robot Interaction, Cognitive Robotics, Modelling Human Perception and Action
Carlo Mazzola
Istituto Italiano di Tecnologia (CONTACT)
Human-Robot Interaction, Cognitive Robotics, Social Robotics, Perception, Deep Learning
Hana Hornáčková
Faculty of Mathematics, Physics and Informatics, Comenius University Bratislava, Slovakia
Ana Farić
Faculty of Education, University of Ljubljana, Slovenia
Kristína Malinovská
Faculty of Mathematics, Physics and Informatics, Comenius University Bratislava, Slovakia
Michal Vavrečka
Faculty of Mathematics, Physics and Informatics, Comenius University Bratislava, Slovakia
Igor Farkaš
Professor of Informatics, Comenius University in Bratislava
Neural NetworksCognitive ScienceArtificial IntelligenceRobotics