🤖 AI Summary
To address the frequent failure of external sensors (e.g., vision, LiDAR) in extreme environments—such as space, military operations, and underwater settings—this paper proposes a robust robot control framework relying solely on proprioceptive inputs (e.g., joint torques and positions). The method introduces, for the first time, long-horizon contact memory into a diffusion model: it dynamically selects salient historical proprioceptive states to construct policy inputs, integrates proprioceptive feature encoding, and employs end-to-end imitation learning to generate adaptive behaviors without exteroceptive sensing. Evaluations on both UR10e simulation and physical platforms demonstrate that the approach significantly improves success rates in contact-intensive tasks. Notably, it maintains high robustness even under complete exteroceptor failure. These results empirically validate the decisive role of long-term proprioceptive memory in enabling reliable exteroception-free manipulation.
📝 Abstract
Diffusion models have revolutionized imitation learning, allowing robots to replicate complex behaviours. However, diffusion policies often rely on cameras and other exteroceptive sensors to observe the environment and lack long-term memory. In space, military, and underwater applications, robots must be highly robust to failures in exteroceptive sensors, operating using only proprioceptive information. In this paper, we propose ProDapt, a method of incorporating long-term memory of previous contacts between the robot and the environment into the diffusion process, allowing it to complete tasks using only proprioceptive data. This is achieved by identifying "keypoints", essential past observations maintained as inputs to the policy. We test our approach with a UR10e robotic arm in both simulated and real-world experiments and demonstrate the necessity of this long-term memory for task completion.
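The abstract does not specify how keypoints are detected or fed to the policy, but the idea of retaining salient past proprioceptive states as extra policy inputs can be sketched as follows. This is a minimal illustration, not the paper's implementation: the torque-residual saliency test, the `KeypointMemory` class, and all thresholds are hypothetical.

```python
import numpy as np
from collections import deque


class KeypointMemory:
    """Illustrative long-horizon memory of salient proprioceptive states.

    A past observation is kept as a "keypoint" when the joint torque
    deviates strongly from a running baseline, used here as a crude
    proxy for a contact event. Thresholds and feature layout are
    hypothetical, not taken from the paper.
    """

    def __init__(self, n_joints=6, torque_thresh=5.0, max_keypoints=8):
        self.torque_thresh = torque_thresh
        self.keypoints = deque(maxlen=max_keypoints)  # oldest evicted first
        self.baseline = np.zeros(n_joints)
        self.alpha = 0.99  # EMA coefficient for the torque baseline

    def update(self, joint_pos, joint_torque):
        """Ingest one proprioceptive sample; store it if it is salient."""
        residual = joint_torque - self.baseline
        self.baseline = self.alpha * self.baseline + (1 - self.alpha) * joint_torque
        if np.linalg.norm(residual) > self.torque_thresh:
            # Keypoint = proprioceptive state at the contact event.
            self.keypoints.append(np.concatenate([joint_pos, joint_torque]))

    def policy_input(self, current_obs):
        """Current observation plus stored keypoints, zero-padded to a fixed size."""
        feats = list(self.keypoints)
        pad = self.keypoints.maxlen - len(feats)
        feats += [np.zeros_like(current_obs)] * pad
        return np.concatenate([current_obs] + feats)
```

At each control step the memory would be updated with the latest joint positions and torques, and `policy_input` would produce the fixed-size conditioning vector consumed by the diffusion policy in place of camera or LiDAR features.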