MRNaB: Mixed Reality-based Robot Navigation Interface using Optical-see-through MR-beacon

πŸ“… 2024-03-28
πŸ›οΈ arXiv.org
πŸ“ˆ Citations: 3
✨ Influential: 0
πŸ“„ PDF
πŸ€– AI Summary
Traditional 2D robot navigation interfaces suffer from limited information dimensionality, hindering intuitive spatial representation and degrading both navigation efficiency and user experience. To address this, we propose MRNaB, a mixed-reality navigation interface leveraging an optical see-through display. Its core innovation is a novel MR-beacon mechanism: users create, reposition, delete, or select persistent spatial beacons in the real world via mid-air gestures, enabling multi-destination anchoring and one-tap navigation. MRNaB integrates HoloLens 2, real-time hand tracking, dynamic beacon rendering, and bidirectional communication with the robot's path-planning module, supporting natural, command-free interaction. A comparative user study demonstrates that MRNaB reduces task completion time by 37% and operation error rate by 52% relative to a baseline 2D interface, while significantly improving subjective satisfaction. These results validate MRNaB's dual advancement in navigation efficiency and human-robot collaborative experience.

πŸ“ Abstract
Recent advancements in robotics have led to the development of numerous interfaces to enhance the intuitiveness of robot navigation. However, the reliance on traditional 2D displays imposes limitations on the simultaneous visualization of information. Mixed Reality (MR) technology addresses this issue by enhancing the dimensionality of information visualization, allowing users to perceive multiple pieces of information concurrently. This paper proposes the Mixed Reality-based Robot Navigation Interface using an Optical-see-through MR-beacon (MRNaB), a novel approach that uses MR-beacons created with an "air tap" and situated in the real world. These beacons are persistent, enabling multi-destination visualization and functioning as signal transmitters for robot navigation, eliminating the need for repeated navigation inputs. The system comprises four primary functions: "Add", "Move", "Delete", and "Select", which respectively add an MR-beacon, move its location, delete it, and select it as a navigation destination. To validate its effectiveness, we conducted comprehensive experiments comparing MRNaB with traditional 2D navigation systems. The results show significant improvements in user performance, both objectively and subjectively, confirming that MRNaB enhances navigation efficiency and user experience. For additional material, please check: https://mertcookimg.github.io/mrnab
Problem

Research questions and friction points this paper is trying to address.

Enhancing robot navigation intuitiveness with Mixed Reality
Overcoming 2D display limitations in information visualization
Reducing repeated navigation inputs via persistent MR-beacons
Innovation

Methods, ideas, or system contributions that make the work stand out.

Uses Optical-see-through MR-beacons for navigation
Enables multi-destination visualization persistently
Integrates Add, Move, Delete, Select functions
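The four beacon operations above can be sketched as a minimal beacon registry. This is purely an illustrative sketch, not the paper's implementation: all class and method names are hypothetical, and the actual system runs on HoloLens 2 with air-tap gestures and communicates with the robot's path planner.

```python
from dataclasses import dataclass, field

@dataclass
class Beacon:
    """A persistent navigation beacon anchored at a world-space position."""
    beacon_id: int
    position: tuple  # hypothetical (x, y, z) in the robot's map frame

@dataclass
class BeaconRegistry:
    """Hypothetical sketch of MRNaB's Add/Move/Delete/Select functions."""
    beacons: dict = field(default_factory=dict)
    next_id: int = 0

    def add(self, position):
        """'Add': create a persistent beacon at the air-tapped position."""
        beacon = Beacon(self.next_id, position)
        self.beacons[beacon.beacon_id] = beacon
        self.next_id += 1
        return beacon.beacon_id

    def move(self, beacon_id, new_position):
        """'Move': relocate an existing beacon in the real world."""
        self.beacons[beacon_id].position = new_position

    def delete(self, beacon_id):
        """'Delete': remove a beacon entirely."""
        del self.beacons[beacon_id]

    def select(self, beacon_id):
        """'Select': return the goal position to hand to the planner."""
        return self.beacons[beacon_id].position
```

Because beacons persist in the registry, multiple destinations can coexist and be re-selected without re-entering navigation inputs, which is the key property the paper attributes to its MR-beacons.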
Eduardo Iglesius
School of Engineering Science, Osaka University

Masato Kobayashi
Cybermedia Center, Osaka University

Yuki Uranishi
The University of Osaka
Computer Vision, XR, Human-Computer Interaction

Haruo Takemura
Cybermedia Center, Osaka University