🤖 AI Summary
This work addresses the challenges of poor interaction robustness and low sample efficiency in legged robots when manipulating heterogeneous articulated objects—such as doors and drawers—due to their diverse joint types and complex dynamics. To overcome these issues, the authors propose an efficient reinforcement learning framework that enables cross-object generalization through geometric abstraction and multimodal perception fusion. Specifically, they introduce a Sampling-based Abstracted Feature Extraction (SAFE) method that encodes handle and panel geometry into low-dimensional features, enhancing generalization. Additionally, an Articulation Information Estimator (ArtIEst) is designed to adaptively fuse proprioceptive and exteroceptive signals for accurate estimation of each object's motion direction and range. Experiments on both simulated and real-world legged robot platforms demonstrate the approach's robustness and sample efficiency across a variety of heterogeneous articulated objects.
📝 Abstract
Legged manipulators offer high mobility and versatile manipulation. However, robust interaction with heterogeneous articulated objects, such as doors, drawers, and cabinets, remains challenging because of the objects' diverse articulation types and the complex dynamics of the legged robot. Existing reinforcement learning (RL)-based approaches often rely on high-dimensional sensory inputs, leading to sample inefficiency. In this paper, we propose a robust and sample-efficient framework for opening heterogeneous articulated objects with a legged manipulator. In particular, we propose Sampling-based Abstracted Feature Extraction (SAFE), which encodes handle and panel geometry into a compact low-dimensional representation, improving cross-domain generalization. Additionally, an Articulation Information Estimator (ArtIEst) is introduced to adaptively mix proprioception with exteroception to estimate the opening direction and range of motion of each object. The proposed framework was deployed to manipulate various heterogeneous articulated objects in simulation and on real-world robot systems. Videos can be found on the project website: https://openheart-icra.github.io/OpenHEART/
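The abstract does not specify how SAFE computes its compact representation. As a rough, hypothetical illustration of the general idea it describes — sampling points from handle/panel geometry and reducing them to a low-dimensional feature — the sketch below subsamples a point cloud and summarizes it with a centroid, principal axes, and extents. The function name, descriptor choice, and sample count are assumptions for illustration, not the paper's design.

```python
import numpy as np

def abstract_geometry_feature(points: np.ndarray, n_samples: int = 64,
                              seed: int = 0) -> np.ndarray:
    """Compress a raw (N, 3) point cloud of a handle or panel into a
    compact 15-D geometric descriptor: centroid (3), principal axes (9),
    and extents along those axes (3). A hypothetical stand-in for a
    SAFE-style low-dimensional abstraction."""
    rng = np.random.default_rng(seed)
    # Subsample a fixed number of points so the feature size is
    # independent of the raw cloud size.
    idx = rng.choice(len(points), size=min(n_samples, len(points)),
                     replace=False)
    sample = points[idx]

    centroid = sample.mean(axis=0)
    centered = sample - centroid
    # Principal axes via SVD (PCA on the centered sample).
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    # Extents of the sampled cloud projected onto each principal axis.
    proj = centered @ vt.T
    extents = proj.max(axis=0) - proj.min(axis=0)

    return np.concatenate([centroid, vt.ravel(), extents])

# Example: a flat, panel-like cloud lying roughly in the x-y plane.
panel = np.random.default_rng(1).normal(scale=[0.4, 0.2, 0.01],
                                        size=(500, 3))
feat = abstract_geometry_feature(panel)
print(feat.shape)  # (15,)
```

A fixed-size descriptor like this keeps the policy input dimension constant across objects of very different shapes, which is one plausible route to the cross-object generalization the abstract claims.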