Spiking monocular event based 6D pose estimation for space application

📅 2025-01-06
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address the low accuracy and poor real-time performance of monocular vision-based 6D pose estimation in on-orbit servicing and space debris removal, this paper proposes the first event-driven pose estimation framework tailored for space applications. We introduce SEENIC, the first real-world event-camera dataset for spacecraft pose estimation, and design S2E2, a lightweight spiking neural network enabling end-to-end, monocular, event-stream-driven 6D pose estimation. Our approach integrates event-based representation learning, monocular geometric priors, and a brain-inspired, ultra-low-power computational architecture, enabling real-time onboard embedded deployment. Evaluated on SEENIC, the method achieves an average positional error of 21 cm and angular error of 14°, demonstrating for the first time the feasibility of spacecraft pose estimation from pure event streams. This work establishes a novel paradigm for autonomous space operations.

📝 Abstract
With the growing interest in On-Orbit Servicing (OOS) and Active Debris Removal (ADR) missions, spacecraft pose estimation algorithms are being developed using deep learning to improve the precision of this complex task and to find the most efficient solution. With the advances of bio-inspired low-power solutions, such as spiking neural networks and event-based processing and cameras, and their recent application to space tasks, we investigate the feasibility of a fully event-based solution to spacecraft pose estimation. In this paper, we present SEENIC, the first event-based dataset for this use case, containing real event frames captured by an event-based camera on a testbed. We describe the methods and results of the first event-based solution to this problem, where our small spiking end-to-end network (S2E2) achieves a position error of 21 cm and a rotation error of 14°, a first step towards fully event-based processing for embedded spacecraft pose estimation.
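The pipeline the abstract describes (event stream in, spiking network, 6D pose out) can be illustrated with a minimal NumPy sketch. This is a hypothetical toy, not the authors' S2E2 architecture: the layer sizes, the leaky integrate-and-fire (LIF) dynamics parameters, and the untrained linear pose readout are all illustrative assumptions.

```python
import numpy as np

def events_to_frame(events, height, width):
    """Accumulate (x, y, polarity) events into a 2-channel count frame.

    Event cameras emit sparse (x, y, t, polarity) tuples; binning them
    into frames is one common way to feed them to a network.
    """
    frame = np.zeros((2, height, width))
    for x, y, p in events:
        frame[int(p), int(y), int(x)] += 1.0
    return frame

class LIFLayer:
    """Toy leaky integrate-and-fire layer: the membrane potential leaks,
    integrates weighted input, and emits a binary spike on threshold."""

    def __init__(self, in_dim, out_dim, tau=0.9, threshold=0.5, seed=0):
        rng = np.random.default_rng(seed)
        self.w = rng.normal(0.0, 1.0 / np.sqrt(in_dim), (out_dim, in_dim))
        self.tau, self.threshold = tau, threshold
        self.v = np.zeros(out_dim)  # membrane potential

    def step(self, x):
        self.v = self.tau * self.v + self.w @ x       # leak + integrate
        spikes = (self.v >= self.threshold).astype(float)
        self.v = np.where(spikes > 0, 0.0, self.v)    # reset spiking neurons
        return spikes

def estimate_pose(event_windows, height=32, width=32, hidden=64):
    """Run successive event windows through the spiking layer and read out
    a 7-vector pose (3 translation + 4 quaternion) from the spike counts.

    The readout weights are random here; in a real system they would be
    trained on a dataset such as SEENIC.
    """
    layer = LIFLayer(2 * height * width, hidden)
    readout = np.random.default_rng(1).normal(0.0, 0.01, (7, hidden))
    spike_count = np.zeros(hidden)
    for events in event_windows:
        x = events_to_frame(events, height, width).ravel()
        spike_count += layer.step(x)
    pose = readout @ spike_count
    pose[3:] /= np.linalg.norm(pose[3:]) + 1e-9       # normalize quaternion
    return pose
```

The design choice this mirrors is the one the abstract motivates: spikes are binary and sparse, so on neuromorphic hardware only the neurons that fire consume energy, which is what makes an end-to-end spiking pipeline attractive for embedded onboard deployment.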
Problem

Research questions and friction points this paper is trying to address.

Space Mission Efficiency
Object Pose Estimation
Single Camera Tracking
Innovation

Methods, ideas, or system contributions that make the work stand out.

Event-based Camera
Event-driven Estimation
Spiking Neural Networks