Perception-Control Coupled Visual Servoing for Textureless Objects Using Keypoint-Based EKF

📅 2026-02-06
🤖 AI Summary
This work addresses the instability and low accuracy of visual servoing for textureless objects under adverse visual conditions such as occlusion, where conventional feature-based methods fail due to insufficient visual cues. To overcome this limitation, the authors propose a tightly coupled perception-control closed-loop visual servoing framework. The approach integrates learned keypoint detection with an Extended Kalman Filter (EKF) to jointly estimate the object’s 6D pose, which drives a pose-based visual servoing (PBVS) controller. Camera motion feedback is incorporated to enhance the robustness of keypoint tracking. Furthermore, a novel uncertainty-aware probabilistic control law is introduced to enable safe and precise manipulation of textureless objects. Real-world robotic experiments demonstrate that the proposed method significantly outperforms traditional visual servoing techniques in both pose estimation accuracy and grasping success rate.
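The core of the perception side is a Kalman-style filter that fuses noisy per-frame keypoint observations with feedback from the commanded camera motion. The paper's EKF estimates full 6D pose from learned keypoints; the sketch below is a simplified, hypothetical version that tracks only 3D object translation with a direct observation model, to illustrate the predict/update structure and the camera-motion feedback the summary describes.

```python
import numpy as np

class KeypointKF:
    """Minimal linear Kalman filter sketch (not the paper's exact EKF):
    fuse noisy keypoint-derived position observations into an object
    translation estimate, using commanded camera motion in the predict step."""

    def __init__(self, x0, P0, Q, R):
        self.x = np.asarray(x0, dtype=float)  # state: object translation (3,)
        self.P = np.asarray(P0, dtype=float)  # state covariance (3, 3)
        self.Q = Q                            # process noise covariance
        self.R = R                            # measurement noise covariance

    def predict(self, cam_motion=np.zeros(3)):
        # Camera-motion feedback: the object appears to move opposite to the
        # commanded camera translation, which closes the perception-control loop.
        self.x = self.x - cam_motion
        self.P = self.P + self.Q

    def update(self, z):
        # Direct observation of the object position from detected keypoints
        # (the paper instead projects 3D keypoints through the camera model).
        H = np.eye(3)
        S = H @ self.P @ H.T + self.R
        K = self.P @ H.T @ np.linalg.inv(S)
        self.x = self.x + K @ (z - H @ self.x)
        self.P = (np.eye(3) - K @ H) @ self.P
```

Running this on repeated noisy observations of a fixed target drives the estimate toward the true position while shrinking the covariance, which is the quantity the uncertainty-aware controller later consumes.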

📝 Abstract
Visual servoing is fundamental to robotic applications, enabling precise positioning and control. However, applying it to textureless objects remains a challenge due to the absence of reliable visual features. Moreover, adverse visual conditions, such as occlusions, often corrupt visual feedback, leading to reduced accuracy and instability in visual servoing. In this work, we build upon learning-based keypoint detection for textureless objects and propose a method that enhances robustness by tightly integrating perception and control in a closed loop. Specifically, we employ an Extended Kalman Filter (EKF) that integrates per-frame keypoint measurements to estimate 6D object pose, which drives pose-based visual servoing (PBVS) for control. The resulting camera motion, in turn, enhances the tracking of subsequent keypoints, effectively closing the perception-control loop. Additionally, unlike standard PBVS, we propose a probabilistic control law that computes both camera velocity and its associated uncertainty, enabling uncertainty-aware control for safe and reliable operation. We validate our approach on real-world robotic platforms using quantitative metrics and grasping experiments, demonstrating that our method outperforms traditional visual servoing techniques in both accuracy and practical application.
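The control side described in the abstract is classical PBVS (exponential decrease of the pose error) made uncertainty-aware. The sketch below is an illustrative stand-in, not the paper's control law: it computes the standard PBVS velocity from a translation error and an axis-angle rotation error, then scales it down as the EKF pose covariance grows, so the robot moves conservatively when visual feedback is unreliable. The gain `lam` and the covariance-sensitivity `beta` are assumed parameters.

```python
import numpy as np

def pbvs_velocity(t_err, R_err, pose_cov, lam=0.5, beta=10.0):
    """Hypothetical uncertainty-aware PBVS sketch.

    t_err:    translation error in the camera frame, shape (3,)
    R_err:    rotation error matrix, shape (3, 3)
    pose_cov: 6x6 pose covariance from the EKF
    Returns (v, w): commanded linear and angular camera velocities.
    """
    # Extract the axis-angle error theta*u from the rotation error matrix.
    theta = np.arccos(np.clip((np.trace(R_err) - 1.0) / 2.0, -1.0, 1.0))
    if np.isclose(theta, 0.0):
        thetau = np.zeros(3)
    else:
        w_hat = (R_err - R_err.T) / (2.0 * np.sin(theta))
        thetau = theta * np.array([w_hat[2, 1], w_hat[0, 2], w_hat[1, 0]])

    # Classical PBVS: exponential decrease of the pose error.
    v = -lam * np.asarray(t_err, dtype=float)
    w = -lam * thetau

    # Illustrative uncertainty-aware scaling: slow the camera down when the
    # EKF covariance is large, so motion stays safe under poor visual feedback.
    scale = 1.0 / (1.0 + beta * np.trace(pose_cov))
    return scale * v, scale * w
```

With zero covariance this reduces to the standard PBVS law; as the covariance trace grows, the commanded velocity shrinks smoothly toward zero.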
Problem

Research questions and friction points this paper is trying to address.

visual servoing
textureless objects
occlusions
visual feedback
6D object pose
Innovation

Methods, ideas, or system contributions that make the work stand out.

Perception-Control Coupling
Keypoint-Based EKF
Textureless Object Servoing
Uncertainty-Aware Control
Pose-Based Visual Servoing
Authors
Allen Tao, University of Toronto (robotics, computer vision, deep learning)
Jun Yang, Epson Canada (SLAM, robot learning, computer vision, machine learning)
Stanko Oparnica, Epson Canada Ltd., Toronto, Canada
Wenjie Xue, Epson Canada Ltd., Toronto, Canada