3HANDS Dataset: Learning from Humans for Generating Naturalistic Handovers with Supernumerary Robotic Limbs

📅 2025-03-06
📈 Citations: 0
Influential: 0
🤖 AI Summary
This study addresses the need for natural handover interactions between humans and hip-mounted supernumerary robotic limbs (SRLs) operating in intimate personal space. To this end, we introduce 3HANDS, the first dedicated dataset of SRL–human handover interactions. Motivated by three key characteristics of human handover behavior, namely primary task execution, implicit coordination, and personal space constraints, we propose a unified generative framework integrating trajectory generation, endpoint selection, and timing prediction. The framework leverages multimodal motion-capture data from real human collaborative handovers and combines conditional generative models (e.g., conditional VAEs and diffusion models) with temporal behavior prediction techniques. A user study (N=10) shows that the generated interactions significantly improve perceived naturalness and comfort while reducing physical workload, outperforming a heuristic baseline across all metrics.
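The summary above mentions conditional generative models (e.g., conditional VAEs) for trajectory generation. The following is an illustrative sketch only, not the authors' implementation: all dimensions and weights are hypothetical placeholders. It shows the core idea of a conditional VAE, where both the encoder and the decoder are conditioned on handover context such as object position and the user's current pose:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions (not from the paper).
TRAJ_DIM = 30    # flattened handover trajectory, e.g., 10 waypoints x 3D
COND_DIM = 9     # context: object position + user wrist/torso pose
LATENT_DIM = 4
HIDDEN = 32

def linear(in_dim, out_dim):
    """Randomly initialized affine layer (weights, bias); untrained."""
    return rng.normal(0, 0.1, (in_dim, out_dim)), np.zeros(out_dim)

# Encoder q(z | trajectory, condition) and decoder p(trajectory | z, condition).
W_enc, b_enc = linear(TRAJ_DIM + COND_DIM, HIDDEN)
W_mu, b_mu = linear(HIDDEN, LATENT_DIM)
W_logvar, b_logvar = linear(HIDDEN, LATENT_DIM)
W_dec, b_dec = linear(LATENT_DIM + COND_DIM, HIDDEN)
W_out, b_out = linear(HIDDEN, TRAJ_DIM)

def encode(traj, cond):
    # The condition is concatenated to the input, making the latent posterior
    # context-dependent.
    h = np.tanh(np.concatenate([traj, cond]) @ W_enc + b_enc)
    return h @ W_mu + b_mu, h @ W_logvar + b_logvar

def decode(z, cond):
    # The same condition also steers decoding, so sampled trajectories respect
    # the handover context.
    h = np.tanh(np.concatenate([z, cond]) @ W_dec + b_dec)
    return h @ W_out + b_out

def generate(cond):
    """At deployment, sample z from the prior and decode given the context."""
    z = rng.standard_normal(LATENT_DIM)
    return decode(z, cond)

# Training-time reconstruction path and deployment-time generation path.
traj = rng.standard_normal(TRAJ_DIM)
cond = rng.standard_normal(COND_DIM)
mu, logvar = encode(traj, cond)
z = mu + np.exp(0.5 * logvar) * rng.standard_normal(LATENT_DIM)  # reparameterization
recon = decode(z, cond)
sample = generate(cond)
```

Conditioning on context is what lets one model handle the asymmetric object origins the dataset captures: the same latent motion style decodes into different trajectories depending on where the object and the user's body are.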

📝 Abstract
Supernumerary robotic limbs (SRLs) are robotic structures integrated closely with the user's body, which augment human physical capabilities and necessitate seamless, naturalistic human-machine interaction. For effective assistance in physical tasks, enabling SRLs to hand over objects to humans is crucial. Yet, designing heuristic-based policies for robots is time-consuming, difficult to generalize across tasks, and results in less human-like motion. When trained with proper datasets, generative models are powerful alternatives for creating naturalistic handover motions. We introduce 3HANDS, a novel dataset of object handover interactions between a participant performing a daily activity and another participant enacting a hip-mounted SRL in a naturalistic manner. 3HANDS captures the unique characteristics of SRL interactions: operating in intimate personal space with asymmetric object origins, implicit motion synchronization, and the user's engagement in a primary task during the handover. To demonstrate the effectiveness of our dataset, we present three models: one that generates naturalistic handover trajectories, another that determines the appropriate handover endpoints, and a third that predicts the moment to initiate a handover. In a user study (N=10), we compare handover interactions performed with our method to a baseline. The findings show that our method was perceived as significantly more natural, less physically demanding, and more comfortable.
Problem

Research questions and friction points this paper is trying to address.

Enabling supernumerary robotic limbs to perform naturalistic object handovers.
Overcoming limitations of heuristic-based policies for robotic handover motions.
Capturing unique characteristics of human-SRL interactions for effective assistance.
Innovation

Methods, ideas, or system contributions that make the work stand out.

3HANDS dataset captures naturalistic SRL handovers.
Generative models create human-like handover trajectories.
Models predict handover endpoints and initiation timing.
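The third contribution, predicting when to initiate a handover, can be framed as a decision over a short window of the user's motion. The sketch below is one plausible formulation, not the paper's actual model: a logistic classifier over simple kinematic features, with purely illustrative feature choices and weights:

```python
import numpy as np

def window_features(wrist_positions, dt=1 / 60):
    """Kinematic features over a motion window (hypothetical feature choice):
    mean wrist speed and speed at the end of the window."""
    vel = np.diff(wrist_positions, axis=0) / dt
    speed = np.linalg.norm(vel, axis=1)
    return np.array([speed.mean(), speed[-1]])

def should_initiate(features, weights, bias, threshold=0.5):
    """Logistic 'initiate the handover now?' decision over window features."""
    p = 1.0 / (1.0 + np.exp(-(features @ weights + bias)))
    return p > threshold, p

# Toy example: a decelerating wrist (a pause in the primary task) acts as a
# hypothetical cue that the user is ready to receive the object.
t = np.linspace(0, 1, 60)
wrist = np.stack([np.exp(-3 * t), np.zeros_like(t), np.zeros_like(t)], axis=1)
feats = window_features(wrist)
weights = np.array([-1.0, -2.0])   # illustrative weights, not learned here
bias = 1.0
go, prob = should_initiate(feats, weights, bias)
```

In practice such a predictor would be trained on the dataset's motion-capture streams so the timing reflects the implicit coordination observed between human partners, rather than hand-set weights as above.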
👥 Authors
Artin Saberpour Abadian (Saarland University, Saarland Informatics Campus, Saarbrücken, Germany)
Yi-Chi Liao (ETH Zürich, Zürich, Switzerland)
Ata Otaran (Saarland University, Saarland Informatics Campus, Saarbrücken, Germany)
Rishabh Dabral (Max-Planck Institute for Informatics)
Marie Muehlhaus (Saarland University, Saarland Informatics Campus, Saarbrücken, Germany)
C. Theobalt (Max Planck Institute for Informatics, Saarbrücken, Germany; Saarland University, Saarland Informatics Campus, Saarbrücken, Germany)
Martin Schmitz (Saarland University, Saarland Informatics Campus, Saarbrücken, Germany)
Jürgen Steimle (Saarland University, Saarland Informatics Campus)