🤖 AI Summary
This paper introduces Scene-agnostic Pose Regression (SPR), a novel task for 6D camera pose estimation in unknown environments, requiring neither retraining nor an image retrieval database. To support this task, we present 360SPR, the first large-scale multi-height panoramic dataset, and propose SPR-Mamba, a dual-branch architecture built on the Mamba framework that jointly models panoramic and pinhole imagery. We further introduce a scene-agnostic training paradigm to strengthen cross-scene generalization. Extensive experiments show that our method consistently outperforms state-of-the-art absolute pose regression, relative pose regression, and visual odometry approaches on unseen scenes in both the 360SPR and 360Loc benchmarks, reducing pose estimation error by up to 32%. This work establishes a lightweight, open-environment visual localization paradigm grounded in scene-agnostic generalization.
📝 Abstract
Absolute Pose Regression (APR) predicts 6D camera poses but cannot adapt to unknown environments without retraining, while Relative Pose Regression (RPR) generalizes better yet requires a large image retrieval database. Visual Odometry (VO) generalizes well to unseen environments but suffers from accumulated error on open trajectories. To address this dilemma, we introduce a new task, Scene-agnostic Pose Regression (SPR), which achieves accurate and flexible pose regression while eliminating the need for retraining or databases. To benchmark SPR, we create a large-scale dataset, 360SPR, with over 200K photorealistic panoramas, 3.6M pinhole images, and camera poses across 270 scenes captured at three different sensor heights. Furthermore, we propose SPR-Mamba, an initial model that addresses SPR in a dual-branch manner. Extensive experiments and studies demonstrate the effectiveness of our SPR paradigm, dataset, and model. In the unknown scenes of both the 360SPR and 360Loc datasets, our method consistently outperforms APR, RPR, and VO. The dataset and code are available at https://junweizheng93.github.io/publications/SPR/SPR.html.