🤖 AI Summary
This work addresses the lack of specialized RGBD datasets for gesture-based control of unmanned ground vehicles (UGVs) by frontline rescue personnel, a gap that hinders human-robot collaboration in disaster scenarios. To close this gap, the authors introduce and publicly release FR-GESTURE, the first open dataset of its kind, comprising 3,312 RGBD samples of 12 tactical hand signals derived from real-world rescue operations. The gestures were refined through feedback from experienced emergency responders and captured from two camera viewpoints at seven distances. Together with a standardized evaluation protocol and baseline experimental results, the dataset establishes a foundational resource for research on intuitive, gesture-driven UGV control in complex disaster environments.
📝 Abstract
The ever-increasing intensity and frequency of disasters make the work of First Responders (FRs) increasingly difficult. Artificial intelligence and robotics solutions could facilitate their operations, compensating for these difficulties. To this end, we propose a dataset for gesture-based UGV control by FRs, introducing a set of 12 commands inspired by existing FR gestures and tactical hand signals, refined with feedback from experienced FRs. We then carry out the data collection itself, resulting in 3,312 RGBD pairs captured from 2 viewpoints and 7 distances. To the best of our knowledge, this is the first dataset specifically intended for gesture-based UGV guidance by FRs. Finally, we define evaluation protocols for our RGBD dataset, termed FR-GESTURE, and perform baseline experiments, which we put forward as a starting point for future improvement. We have made the data publicly available to promote future research in this domain: https://doi.org/10.5281/zenodo.18131333.
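Since the dataset consists of paired RGB and depth frames, the sketch below shows one way to enumerate and load such pairs from a local copy. Note that the directory layout, the `_rgb.png`/`_depth.png` file-name suffixes, and the 16-bit depth encoding are assumptions made for illustration; consult the Zenodo record for the dataset's actual structure.

```python
# Minimal sketch for pairing and loading RGBD samples from a local copy of
# FR-GESTURE. NOTE: the directory layout and file-name suffixes below
# ("_rgb.png" / "_depth.png") are hypothetical; check the Zenodo record
# (https://doi.org/10.5281/zenodo.18131333) for the actual structure.
from pathlib import Path

import cv2


def load_rgbd_pairs(root: str):
    """Yield (rgb, depth) arrays for every RGB image with a matching depth map."""
    root_path = Path(root)
    for rgb_file in sorted(root_path.rglob("*_rgb.png")):
        depth_file = rgb_file.with_name(rgb_file.name.replace("_rgb", "_depth"))
        if not depth_file.exists():
            continue  # skip frames without a matching depth map
        rgb = cv2.cvtColor(cv2.imread(str(rgb_file)), cv2.COLOR_BGR2RGB)
        # Depth maps are commonly stored as 16-bit PNGs (e.g., millimeters);
        # IMREAD_UNCHANGED preserves the full bit depth instead of clipping to 8 bits.
        depth = cv2.imread(str(depth_file), cv2.IMREAD_UNCHANGED)
        yield rgb, depth


if __name__ == "__main__":
    for rgb, depth in load_rgbd_pairs("FR-GESTURE"):
        print(rgb.shape, depth.shape, depth.dtype)
        break
```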