🤖 AI Summary
To address the insufficient robustness of LiDAR–camera extrinsic calibration in outdoor and extraterrestrial multi-robot systems—caused by target degradation and sensor deterioration—this paper proposes a cooperative calibration method using spherical targets. We design a hierarchical weighted accumulation algorithm for precise 3D sphere-center extraction from point clouds, and integrate SAM-based image segmentation, distortion-resistant elliptical center detection, and perspective projection error correction to achieve high-accuracy 2D/3D center registration under severe target degradation and complex interference. Furthermore, we formulate a joint optimization framework combining 2D ellipse fitting and 3D sphere fitting to robustly estimate the extrinsic transformation matrix. Experiments across multiple LiDAR models and in realistic outdoor and planetary-analogue environments demonstrate that our method significantly outperforms conventional target-based approaches in both calibration accuracy and stability.
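The 3D sphere-center extraction builds on sphere fitting from point-cloud measurements. The hierarchical weighted accumulation algorithm is specific to the paper; as a generic baseline only, the underlying least-squares sphere fit can be sketched algebraically (each point satisfies |p − c|² = r², which is linear in (c, r² − |c|²)):

```python
import numpy as np

def fit_sphere(points):
    """Algebraic least-squares sphere fit: returns (center, radius).

    Each point p satisfies |p - c|^2 = r^2, i.e.
    2 p . c + (r^2 - |c|^2) = |p|^2, which is linear in the
    unknowns (c, r^2 - |c|^2).
    """
    p = np.asarray(points, dtype=float)
    A = np.hstack([2.0 * p, np.ones((len(p), 1))])  # coefficients of (c, t)
    b = (p ** 2).sum(axis=1)                        # |p|^2 per point
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    center = sol[:3]
    radius = np.sqrt(sol[3] + center @ center)      # t = r^2 - |c|^2
    return center, radius
```

This plain fit treats all points equally; the paper's contribution is precisely to weight and accumulate points hierarchically so that noisy or partially degraded spheres still yield an accurate center.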
📝 Abstract
This paper presents a novel spherical-target-based LiDAR–camera extrinsic calibration method designed for outdoor environments with multi-robot systems, accounting for both target and sensor corruption. The method extracts the 2D ellipse center from the image and the 3D sphere center from the point cloud, then pairs them to compute the transformation matrix. Specifically, the image is first decomposed using the Segment Anything Model (SAM). A novel algorithm then extracts an ellipse from a potentially corrupted sphere, and the extracted ellipse center is corrected for errors introduced by the perspective projection model. In the LiDAR point cloud, points on the sphere tend to be highly noisy due to the absence of flat regions. To accurately extract the sphere from these noisy measurements, we apply a hierarchical weighted sum to the accumulated point cloud. Through experiments, we demonstrate that the sphere can be robustly detected under both types of corruption, outperforming other calibration targets. We evaluate our method using three different types of LiDAR (spinning, solid-state, and non-repetitive) with cameras positioned at three different locations. Furthermore, we validate robustness to target corruption by experimenting with spheres subjected to various types of degradation. These experiments were conducted in both a planetary test environment and a field environment. Our code is available at https://github.com/sparolab/MARSCalib.
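Once 3D sphere centers (LiDAR frame) are paired with their 2D image centers, estimating the extrinsic transform is a Perspective-n-Point problem. The paper solves it within a joint 2D-ellipse/3D-sphere optimization; as an illustrative sketch only, a minimal Direct Linear Transform (DLT) PnP, assuming normalized (intrinsics-free), noiseless correspondences, looks like:

```python
import numpy as np

def pnp_dlt(pts3d, pts2d):
    """Recover (R, t) from >= 6 paired 3D points and normalized 2D points.

    Builds the 2n x 12 DLT system from lambda*[u, v, 1]^T = P [X, Y, Z, 1]^T
    and projects the left 3x3 of the recovered P onto SO(3).
    """
    A = []
    for (X, Y, Z), (u, v) in zip(pts3d, pts2d):
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z, -u])
        A.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z, -v])
    _, _, Vt = np.linalg.svd(np.asarray(A))
    P = Vt[-1].reshape(3, 4)                  # null vector, up to scale/sign
    P /= np.cbrt(np.linalg.det(P[:, :3]))     # fix scale and sign: det(R) = +1
    U, _, Vt2 = np.linalg.svd(P[:, :3])       # nearest rotation matrix
    return U @ Vt2, P[:, 3]
```

A real calibration pipeline would refine this linear estimate by minimizing reprojection error over all sphere placements, which is closer in spirit to the joint optimization described above.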