Adam SLAM - the last mile of camera calibration with 3DGS

📅 2025-08-28
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
In real-world scenarios, camera calibration often lacks ground-truth parameters, limiting the quality of 3D reconstruction and novel-view synthesis. To address this, we propose an end-to-end differentiable camera calibration method based on 3D Gaussian Splatting (3DGS). Our approach jointly optimizes camera poses and intrinsics by backpropagating the color loss from differentiable novel-view rendering directly to camera parameters, using the Adam optimizer for gradient-based updates. Crucially, it requires no ground-truth calibration data and operates effectively in unstructured, prior-free real-world scenes. Evaluated on standard 3DGS benchmark datasets, our method achieves an average PSNR improvement of 0.4 dB over baseline methods, demonstrating enhanced geometric fidelity and rendering quality. This work establishes a robust, learning-based calibration paradigm for high-precision neural rendering, eliminating reliance on traditional calibration targets or manual initialization.

Technology Category

Application Category

📝 Abstract
The quality of the camera calibration is of major importance for evaluating progresses in novel view synthesis, as a 1-pixel error on the calibration has a significant impact on the reconstruction quality. While there is no ground truth for real scenes, the quality of the calibration is assessed by the quality of the novel view synthesis. This paper proposes to use a 3DGS model to fine tune calibration by backpropagation of novel view color loss with respect to the cameras parameters. The new calibration alone brings an average improvement of 0.4 dB PSNR on the dataset used as reference by 3DGS. The fine tuning may be long and its suitability depends on the criticity of training time, but for calibration of reference scenes, such as Mip-NeRF 360, the stake of novel view quality is the most important.
Problem

Research questions and friction points this paper is trying to address.

Improving camera calibration accuracy through 3DGS backpropagation
Reducing calibration errors that impact novel view synthesis quality
Optimizing camera parameters using color loss from novel views
Innovation

Methods, ideas, or system contributions that make the work stand out.

3DGS model fine-tunes camera calibration
Backpropagates novel view color loss
Improves PSNR by 0.4 dB
🔎 Similar Papers
No similar papers found.