🤖 AI Summary
To address the challenge of identifying vulnerable road users (VRUs) under metamerism—where spectrally distinct objects appear identical under urban lighting conditions—this paper proposes a hyperspectral band selection method integrating Contrast Signal-to-Noise Ratio (CSNR) and Joint Mutual Information Maximization (JMIM). It is the first to incorporate image quality metrics into band optimization to enhance spectral separability between VRUs and background. Evaluated on the H-City dataset, the method identifies three optimal bands: 497 nm, 607 nm, and 895 nm (±27 nm). Compared to RGB, these bands improve Euclidean distance, Spectral Angle Mapper (SAM), Hotelling’s T² statistic, and CIE ΔE by 70.24%, 528.46%, 1206.83%, and 246.62%, respectively, significantly boosting perception robustness. This work establishes a novel paradigm for lightweight deployment of hyperspectral sensing in ADAS and autonomous driving systems.
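The summary names the two ingredients of the selection strategy: CSNR as a per-band quality score and JMIM as the information-theoretic selection criterion. Below is a minimal NumPy sketch of what such a pipeline could look like. The function names, the histogram-based mutual-information estimator, the bin count, the CSNR formula (foreground/background mean contrast over background noise), and the way the two scores would be combined are all illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def csnr(band, fg_mask, bg_mask):
    """Contrast signal-to-noise ratio of one band (assumed definition):
    |mean(foreground) - mean(background)| over background std."""
    fg, bg = band[fg_mask], band[bg_mask]
    return np.abs(fg.mean() - bg.mean()) / (bg.std() + 1e-12)

def _quantize(x, bins):
    """Quantize a continuous band into integer codes 0..bins-1."""
    edges = np.linspace(x.min(), x.max(), bins + 1)
    return np.clip(np.digitize(x, edges[1:-1]), 0, bins - 1)

def _discrete_mi(a, b):
    """Mutual information (nats) between two discrete integer arrays,
    estimated from the empirical joint histogram."""
    joint = np.zeros((a.max() + 1, b.max() + 1))
    np.add.at(joint, (a, b), 1)
    pxy = joint / joint.sum()
    px = pxy.sum(1, keepdims=True)
    py = pxy.sum(0, keepdims=True)
    nz = pxy > 0
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

def jmim_select(cube, labels, k=3, bins=16):
    """Greedy JMIM band selection: start from the band with the highest
    I(X_b; Y), then repeatedly pick argmax_b min_{s in S} I((X_b, X_s); Y)."""
    n_bands = cube.shape[-1]
    X = cube.reshape(-1, n_bands)
    y = labels.ravel().astype(int)
    q = np.stack([_quantize(X[:, b], bins) for b in range(n_bands)], axis=1)
    mi = [_discrete_mi(q[:, b], y) for b in range(n_bands)]
    selected = [int(np.argmax(mi))]
    while len(selected) < k:
        scores = []
        for b in range(n_bands):
            if b in selected:
                scores.append(-np.inf)
                continue
            # I((X_b, X_s); Y): encode the band pair as one discrete variable
            jmi = [_discrete_mi(q[:, b] * bins + q[:, s], y) for s in selected]
            scores.append(min(jmi))
        selected.append(int(np.argmax(scores)))
    return selected
```

With a labeled cube (`H x W x bands` reflectances plus a VRU/background mask), `jmim_select` returns `k` band indices and `csnr` can rank or filter candidates beforehand; the paper's exact integration of the two criteria is not reproduced here.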
📝 Abstract
Protecting Vulnerable Road Users (VRUs) is a critical safety challenge for automotive perception systems, particularly under the visual ambiguity caused by metamerism, a phenomenon in which distinct materials appear similar in RGB imagery. This work investigates hyperspectral imaging (HSI) to overcome this limitation by capturing unique material signatures beyond the visible spectrum, especially in the Near-Infrared (NIR). To manage the inherent high dimensionality of HSI data, we propose a band selection strategy that integrates information-theoretic techniques (joint mutual information maximization, correlation analysis) with a novel application of an image quality metric (contrast signal-to-noise ratio) to identify the most spectrally informative bands. Using the Hyperspectral City V2 (H-City) dataset, we identify three informative bands (497 nm, 607 nm, and 895 nm, $\pm 27$ nm) and reconstruct pseudo-color images for comparison with co-registered RGB. Quantitative results demonstrate increased dissimilarity and perceptual separability of VRUs from the background. The selected HSI bands yield improvements of 70.24%, 528.46%, and 1206.83% on the Euclidean, SAM, and Hotelling's $T^2$ dissimilarity metrics, respectively, and 246.62% on the perceptual CIE $\Delta E$ metric, consistently outperforming RGB and confirming a marked reduction in metameric confusion. By providing a spectrally optimized input, our method enhances VRU separability, establishing a robust foundation for downstream perception tasks in Advanced Driver Assistance Systems (ADAS) and Autonomous Driving (AD), ultimately contributing to improved road safety.
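The four metrics reported in the abstract all have standard definitions, which the sketch below implements in NumPy: the SAM angle between two spectra, the two-sample Hotelling $T^2$ statistic with pooled covariance between pixel sets, and CIE $\Delta E^*_{ab}$ (1976) as the Euclidean distance in CIELAB. Whether the paper uses these exact variants (e.g., $\Delta E_{76}$ rather than $\Delta E_{2000}$, or a different $T^2$ normalization) is an assumption made here for illustration.

```python
import numpy as np

def sam(a, b):
    """Spectral Angle Mapper: angle in radians between two spectra;
    0 means identical spectral shape regardless of brightness."""
    cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)
    return float(np.arccos(np.clip(cos, -1.0, 1.0)))

def hotelling_t2(X, Y):
    """Two-sample Hotelling T^2 between pixel sets X (n1, d) and Y (n2, d),
    using the pooled sample covariance."""
    n1, n2 = len(X), len(Y)
    d = X.mean(axis=0) - Y.mean(axis=0)
    Sp = ((n1 - 1) * np.cov(X, rowvar=False)
          + (n2 - 1) * np.cov(Y, rowvar=False)) / (n1 + n2 - 2)
    return float(n1 * n2 / (n1 + n2) * d @ np.linalg.solve(Sp, d))

def delta_e76(lab1, lab2):
    """CIE Delta-E*ab (1976): plain Euclidean distance in CIELAB space
    (the same formula gives the Euclidean dissimilarity metric when
    applied to raw band vectors instead of Lab triples)."""
    return float(np.linalg.norm(np.asarray(lab1) - np.asarray(lab2)))
```

Comparing mean VRU and mean background spectra (or pixel populations, for $T^2$) from the RGB channels versus the three selected bands with these functions is the kind of evaluation the reported percentage gains describe.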