CAR-LOAM: Color-Assisted Robust LiDAR Odometry and Mapping

📅 2025-02-24
🤖 AI Summary
Conventional LiDAR odometry and mapping (LOAM) methods lack robustness in complex outdoor environments such as forests and campuses, where geometric features are sparse, ambiguous, or dynamic. Method: the paper proposes a tightly coupled LiDAR–camera fusion framework that (i) colorizes LiDAR point clouds using synchronized camera imagery; (ii) applies a perceptually uniform CIEDE2000 color-difference weighting to reject chromatic correspondence outliers; (iii) combines edge and planar geometric features with a Welsch-kernel robust error metric for outlier-resilient pose optimization; and (iv) enforces consistency constraints between the RGB and geometric feature spaces. Results: experiments on challenging outdoor sequences show substantial gains in localization accuracy and mapping robustness. The method produces dense, photorealistic, full-color 3D maps and outperforms state-of-the-art LOAM and LiDAR–vision fusion approaches in both quantitative metrics and qualitative fidelity.
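The Welsch-kernel pose optimization summarized above can be sketched as one iteratively reweighted least-squares (IRLS) step: residuals are converted into exponentially decaying weights, and a weighted rigid alignment (Kabsch/Umeyama) is solved in closed form. The function names, the point-to-point residual form, and the fixed `sigma` are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

def welsch_weight(residuals, sigma):
    """Welsch influence weight w(r) = exp(-(r/sigma)^2).

    Residuals far beyond sigma receive near-zero weight, so outlier
    correspondences barely affect the pose update.
    """
    return np.exp(-(residuals / sigma) ** 2)

def weighted_pose_step(src, dst, sigma):
    """One IRLS step of a Welsch-weighted rigid alignment.

    src, dst: (N, 3) matched point sets. Returns (R, t) minimizing the
    Welsch-weighted sum of squared point-to-point distances.
    """
    r = np.linalg.norm(src - dst, axis=1)            # per-pair residuals
    w = welsch_weight(r, sigma)                      # robust weights
    w /= w.sum()
    mu_s = w @ src                                   # weighted centroids
    mu_d = w @ dst
    # Weighted cross-covariance of the centered point sets.
    H = (src - mu_s).T @ (w[:, None] * (dst - mu_d))
    U, _, Vt = np.linalg.svd(H)
    # Sign correction keeps R a proper rotation (det = +1).
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = mu_d - R @ mu_s
    return R, t
```

Because the Welsch weight decays exponentially with the residual, gross outliers contribute almost nothing to the update, unlike a plain least-squares ICP step where a single bad match can drag the pose estimate.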

📝 Abstract
In this letter, we propose a color-assisted robust framework for accurate LiDAR odometry and mapping (LOAM). Simultaneously receiving data from both the LiDAR and the camera, the framework utilizes the color information from the camera images to colorize the LiDAR point clouds and then performs iterative pose optimization. For each LiDAR scan, the edge and planar features are extracted, colored using the corresponding image, and then matched to a global map. Specifically, we adopt a perceptually uniform color-difference weighting strategy to exclude color correspondence outliers and a robust error metric based on Welsch's function to mitigate the impact of positional correspondence outliers during the pose optimization process. As a result, the system achieves accurate localization and reconstructs dense, accurate, colored three-dimensional (3D) maps of the environment. Thorough experiments in challenging scenarios, including complex forests and a campus, show that our method provides higher robustness and accuracy compared with current state-of-the-art methods.
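The color-correspondence weighting in the abstract can be sketched as follows. For simplicity this uses the CIE76 Euclidean Delta-E in CIELAB as a stand-in for the CIEDE2000 formula the paper employs, and the linear falloff with threshold `delta_e_max` is a hypothetical rejection rule, not a value from the paper:

```python
import numpy as np

def color_weight(lab_scan, lab_map, delta_e_max=20.0):
    """Down-weight correspondences whose colors disagree perceptually.

    lab_scan, lab_map: (N, 3) CIELAB colors of matched scan/map points.
    Uses the simple CIE76 Euclidean Delta-E (an assumption; the paper
    uses CIEDE2000). Pairs with Delta-E >= delta_e_max get weight 0 and
    are effectively rejected as chromatic outliers.
    """
    delta_e = np.linalg.norm(lab_scan - lab_map, axis=1)
    return np.clip(1.0 - delta_e / delta_e_max, 0.0, 1.0)  # linear falloff
```

Working in a perceptually uniform space means that equal weight reductions correspond to roughly equal perceived color differences, which plain RGB distances do not guarantee.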
Problem

Research questions and friction points this paper is trying to address.

Insufficient robustness of LiDAR odometry in feature-sparse outdoor scenes
Limited 3D mapping accuracy when color information is discarded
Sensitivity of pose optimization to correspondence outliers
Innovation

Methods, ideas, or system contributions that make the work stand out.

Color-assisted LiDAR odometry
Perceptually uniform color weighting
Welsch's function robust error
Authors

Yufei Lu (School of Optics and Photonics, Beijing Institute of Technology, Beijing 100081, China)
Yuetao Li (Beijing Institute of Technology)
Zhizhou Jia (School of Optics and Photonics, Beijing Institute of Technology, Beijing 100081, China)
Qun Hao (School of Optics and Photonics, Beijing Institute of Technology, Beijing 100081, China)
Shaohui Zhang (Dongguan University of Technology)