Event-based Motion & Appearance Fusion for 6D Object Pose Tracking

📅 2026-03-09
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the challenge of robust 6D object pose tracking in high-speed dynamic scenes, where conventional RGB-D cameras suffer from motion blur and limited frame rates. The authors propose a learning-free fusion strategy that integrates the high temporal resolution motion cues from an event camera with appearance-based template matching. Specifically, 6D velocity is propagated via event-based optical flow, and the resulting pose estimate is refined through local template matching. The method achieves tracking performance comparable to or better than state-of-the-art approaches on fast-moving objects, significantly enhancing robustness in highly dynamic environments. This advancement demonstrates the strong potential of event cameras for real-time 6D pose tracking applications.

📝 Abstract
Object pose tracking is a fundamental and essential task for robots operating in home and industrial settings. The most commonly used sensors for this task are RGB-D cameras, which hit limitations in highly dynamic environments due to motion blur and frame-rate constraints. Event cameras have remarkable features such as high temporal resolution and low latency, which make them potentially ideal vision sensors for object pose tracking at high speed. Even so, there are still only a few works on 6D pose tracking with event cameras. In this work, we take advantage of the high temporal resolution and propose a method that combines a propagation step with a pose correction strategy. Specifically, we use the 6D object velocity obtained from event-based optical flow for pose propagation, after which a template-based local module corrects the pose. Our learning-free method performs comparably to state-of-the-art algorithms, and in some cases outperforms them for fast-moving objects. The results indicate the potential of event cameras in highly dynamic scenarios where the use of deep network approaches is limited by low update rates.
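The propagation step described in the abstract can be sketched as applying a 6D velocity (twist) to the current pose via the SE(3) exponential map. The code below is an illustrative reconstruction, not the authors' implementation; the twist is assumed to come from event-based optical flow, and the template-based correction step is omitted.

```python
import numpy as np

def hat(w):
    """Skew-symmetric matrix of a 3-vector (so(3) hat operator)."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def se3_exp(twist, dt):
    """Exponential map: 6D twist (v, omega), scaled by dt, to a 4x4 pose increment.

    `twist[:3]` is linear velocity, `twist[3:]` is angular velocity.
    """
    v, w = twist[:3] * dt, twist[3:] * dt
    theta = np.linalg.norm(w)
    W = hat(w)
    if theta < 1e-9:
        # Small-angle approximation near the identity.
        R, V = np.eye(3) + W, np.eye(3)
    else:
        R = (np.eye(3) + np.sin(theta) / theta * W
             + (1.0 - np.cos(theta)) / theta**2 * W @ W)
        V = (np.eye(3) + (1.0 - np.cos(theta)) / theta**2 * W
             + (theta - np.sin(theta)) / theta**3 * W @ W)
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = V @ v
    return T

def propagate_pose(T, twist, dt):
    """Propagate the object pose by the estimated 6D velocity over dt.

    In the paper's pipeline, a template-based local correction
    would refine the result of this step (not shown here).
    """
    return se3_exp(twist, dt) @ T
```

A pure-translation twist such as `[1, 0, 0, 0, 0, 0]` applied for `dt = 0.5` moves an identity pose by 0.5 m along x, which is a quick sanity check on the propagation.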
Problem

Research questions and friction points this paper is trying to address.

6D object pose tracking
event cameras
highly dynamic environments
motion blur
temporal resolution
Innovation

Methods, ideas, or system contributions that make the work stand out.

event camera
6D pose tracking
pose propagation
template-based correction
optical flow