🤖 AI Summary
To address the complexity and high development barrier of designing human–robot collaborative workflows in augmented reality (AR), this paper introduces ARTHUR, an open-source AR authoring tool. ARTHUR adopts a hybrid user interface that combines desktop and touchscreen modalities for authoring with an AR head-mounted display for testing and in-situ refinement, enabling visual programming and on-site tuning of multimodal feedback, control actions, and triggering conditions. The authors propose a composable interaction model comprising 20 feedback types, 10 actions, and 18 conditions, from which users can build AR interaction spaces, controls, and information visualizations for collaboration with robot arms without writing code. ARTHUR integrates AR rendering, multimodal interaction handling, and robot control interfaces, and replicates representative collaborative task scenarios from prior work to demonstrate its general applicability. A five-participant evaluation reflects on the usefulness of the hybrid user interface approach and the provided functionality, highlighting directions for future work.
📝 Abstract
While augmented reality shows promise for supporting human-robot collaboration, creating such interactive systems still poses great challenges. Addressing this, we introduce ARTHUR, an open-source authoring tool for augmented reality-supported human-robot collaboration. ARTHUR supports 20 types of multi-modal feedback to convey robot, task, and system state, 10 actions that enable the user to control the robot and system, and 18 conditions for feedback customization and triggering of actions. By combining these elements, users can create interaction spaces, controls, and information visualizations in augmented reality for collaboration with robot arms. With ARTHUR, we propose to combine desktop interfaces and touchscreen devices for effective authoring, with head-mounted displays for testing and in-situ refinements. To demonstrate the general applicability of ARTHUR for human-robot collaboration scenarios, we replicate representative examples from prior work. Further, in an evaluation with five participants, we reflect on the usefulness of our hybrid user interface approach and the provided functionality, highlighting directions for future work.
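The abstract's core pattern, combining conditions with actions and feedback to author AR behaviors, can be pictured as trigger rules. Below is a minimal Python sketch of that pattern; all names, thresholds, and the `Rule` class are hypothetical illustrations, not ARTHUR's actual API.

```python
# Hypothetical sketch (not ARTHUR's actual API): composing conditions,
# actions, and feedback into a trigger rule, mirroring the paper's
# condition -> action/feedback authoring pattern.
from dataclasses import dataclass, field
from typing import Callable, Dict, List

State = Dict[str, float]  # snapshot of robot/task/system state


@dataclass
class Rule:
    """Fires its actions and feedback when all conditions hold."""
    conditions: List[Callable[[State], bool]]
    actions: List[Callable[[State], None]] = field(default_factory=list)
    feedback: List[Callable[[State], None]] = field(default_factory=list)

    def evaluate(self, state: State) -> bool:
        if all(cond(state) for cond in self.conditions):
            for act in self.actions:
                act(state)
            for fb in self.feedback:
                fb(state)
            return True
        return False


# Illustrative rule: stop the robot and show a proximity warning when
# the human comes within 0.3 m of the arm (values are made up).
log: List[str] = []
rule = Rule(
    conditions=[lambda s: s["human_robot_distance"] < 0.3],
    actions=[lambda s: log.append("stop_robot")],
    feedback=[lambda s: log.append("show_proximity_warning")],
)
rule.evaluate({"human_robot_distance": 0.2})  # fires both lists
rule.evaluate({"human_robot_distance": 0.8})  # no-op
```

In a visual-programming front end, each condition, action, and feedback element would correspond to a node the user wires together rather than code they write.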