Intuitive Human-Drone Collaborative Navigation in Unknown Environments through Mixed Reality

📅 2025-04-02
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
To address the challenges of rigid remote control, weak spatial situational awareness, and low collaboration efficiency for unmanned aerial vehicles (UAVs) in unknown environments, this paper proposes a mixed reality (MR)-enabled human-UAV collaborative navigation system. The system introduces a novel bidirectional spatial information sharing mechanism between head-mounted displays (HMDs) and UAVs, integrating simultaneous localization and mapping (SLAM), spatial semantic mapping, and natural interaction techniques. This elevates traditional command-based control to immersive, semantics-aware collaboration, enabling real-time intent alignment in dynamic scenarios. A user study conducted in a simulated post-disaster environment demonstrates that, compared to first-person view (FPV) systems, the proposed approach reduces task completion time by 37%, decreases operational error rate by 62%, and improves spatial situational understanding accuracy by 58%. This work significantly enhances human-UAV spatial co-presence and collaborative safety, establishing a new paradigm for intuitive, high-reliability human-UAV shared autonomy in unstructured environments.

Technology Category

Application Category

📝 Abstract
Considering the widespread integration of aerial robots in inspection, search and rescue, and monitoring tasks, there is a growing demand to design intuitive human-drone interfaces. These aim to streamline and enhance the user interaction and collaboration process during drone navigation, ultimately expediting mission success and accommodating users' inputs. In this paper, we present a novel human-drone mixed reality interface that aims to (a) increase human-drone spatial awareness by sharing relevant spatial information and representations between the human equipped with a Head Mounted Display (HMD) and the robot and (b) enable safer and intuitive human-drone interactive and collaborative navigation in unknown environments beyond the simple command and control or teleoperation paradigm. We validate our framework through extensive user studies and experiments in a simulated post-disaster scenarios, comparing its performance against a traditional First-Person View (FPV) control systems. Furthermore, multiple tests on several users underscore the advantages of the proposed solution, which offers intuitive and natural interaction with the system. This demonstrates the solution's ability to assist humans during a drone navigation mission, ensuring its safe and effective execution.
Problem

Research questions and friction points this paper is trying to address.

Design intuitive human-drone interfaces for better interaction
Enhance spatial awareness in human-drone collaborative navigation
Enable safer navigation in unknown environments using mixed reality
Innovation

Methods, ideas, or system contributions that make the work stand out.

Mixed reality interface for human-drone collaboration
Shared spatial awareness via Head Mounted Display
Intuitive navigation beyond teleoperation in unknowns
🔎 Similar Papers
No similar papers found.
S
Sanket A. Salunkhe
New York University, Tandon School of Engineering, Brooklyn, NY 11201, USA
P
Pranav Nedunghat
New York University, Tandon School of Engineering, Brooklyn, NY 11201, USA
L
Luca Morando
New York University, Tandon School of Engineering, Brooklyn, NY 11201, USA
Nishanth Bobbili
Nishanth Bobbili
University of California, Berkeley
Optimal ControlMotion Planning
Guanrui Li
Guanrui Li
Assistant Professor, Robotics Engineering, Worcester Polytechnic Institute
Aerial RoboticsMulti-Robot Systems
Giuseppe Loianno
Giuseppe Loianno
UC Berkeley
RoboticsMAVsVisionSensor Fusion