Robust LiDAR-Camera Calibration With 2D Gaussian Splatting

πŸ“… 2025-04-01
πŸ›οΈ IEEE Robotics and Automation Letters
πŸ“ˆ Citations: 0
✨ Influential: 0
πŸ“„ PDF
πŸ€– AI Summary
This paper addresses the challenging problem of targetless LiDAR–camera extrinsic calibration. We propose an end-to-end differentiable calibration method based on neural rendering. Our key contribution is the first introduction of a LiDAR-point-cloud-driven 2D Gaussian rasterization scheme as a color-free texture representation, eliminating reliance on scene texture and illumination conditions. We formulate a unified optimization framework integrating photometric, reprojection, and triangulation geometric constraints to ensure geometrically consistent extrinsic estimation. The method requires no artificial calibration targets (e.g., checkerboards) or auxiliary objects and supports fully differentiable end-to-end training. Evaluated on real-world scenes, it achieves sub-pixel reprojection accuracy and reduces calibration error by 37% over state-of-the-art methods. It demonstrates significantly enhanced robustness and practicality in texture-poor, low-light, and dynamic environments.

Technology Category

Application Category

πŸ“ Abstract
LiDAR-camera systems have become increasingly popular in robotics recently. A critical and initial step in integrating the LiDAR and camera data is the calibration of the LiDAR-camera system. Most existing calibration methods rely on auxiliary target objects, which often involve complex manual operations, whereas targetless methods have yet to achieve practical effectiveness. Recognizing that 2D Gaussian Splatting (2DGS) can reconstruct geometric information from camera image sequences, we propose a calibration method that estimates LiDAR-camera extrinsic parameters using geometric constraints. The proposed method begins by reconstructing colorless 2DGS using LiDAR point clouds. Subsequently, we update the colors of the Gaussian splats by minimizing the photometric loss. The extrinsic parameters are optimized during this process. Additionally, we address the limitations of the photometric loss by incorporating the reprojection and triangulation losses, thereby enhancing the calibration robustness and accuracy.
Problem

Research questions and friction points this paper is trying to address.

Calibrating LiDAR-camera systems without target objects
Improving calibration accuracy using 2D Gaussian Splatting
Enhancing robustness with reprojection and triangulation losses
Innovation

Methods, ideas, or system contributions that make the work stand out.

Uses 2D Gaussian Splatting for geometric reconstruction
Optimizes extrinsic parameters via photometric loss
Enhances robustness with reprojection and triangulation losses
πŸ”Ž Similar Papers
No similar papers found.
S
Shuyi Zhou
The Institute of Industrial Science, The University of Tokyo, Japan
S
Shuxiang Xie
The Institute of Industrial Science, The University of Tokyo, Japan
R
R. Ishikawa
The Institute of Industrial Science, The University of Tokyo, Japan
Takeshi Oishi
Takeshi Oishi
Associate Professor of Institute of Industrial Science, The University of Tokyo
Computer Vision