🤖 AI Summary
To address the scarcity of thermal imaging data, insufficient simulation fidelity, and the limited diversity of existing RGB-to-thermal translation methods for Advanced Driver Assistance Systems (ADAS), this paper proposes Component-aware Adaptive Instance Normalization (CoAdaIN), a fine-grained, component-level style transfer technique that overcomes the limitations of conventional global AdaIN. Building upon CoAdaIN, we design an end-to-end multi-modal translation framework enabling one-to-many thermal image generation. Experiments demonstrate that our method synthesizes high-fidelity, diverse thermal images across various driving scenarios, thereby strengthening the robustness and generalization of ADAS environmental perception under low-light and adverse weather conditions. This work establishes a novel paradigm for thermal data augmentation and simulation-based system modeling in autonomous driving.
📝 Abstract
Thermal imaging in Advanced Driver Assistance Systems (ADAS) improves road safety by providing superior perception in low-light and harsh weather conditions compared to traditional RGB cameras. However, research in this area is hampered by limited dataset availability and poor representation in driving simulators. RGB-to-thermal image translation offers a potential solution, but existing methods focus on one-to-one mappings. We propose a one-to-many mapping using a multi-modal translation framework enhanced with our Component-aware Adaptive Instance Normalization (CoAdaIN). Unlike the original AdaIN, which applies styles globally, CoAdaIN adapts styles to different image components individually. The result, as we show, is more realistic and diverse thermal image translations. This is the accepted author manuscript of the paper published in IEEE Sensors Conference 2024. The final published version is available at DOI: 10.1109/SENSORS60989.2024.10785056.
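The global-versus-component distinction above can be illustrated with a minimal NumPy sketch. This is not the paper's implementation: the feature shapes, the `eps` constant, and the use of integer segmentation masks to define "components" are assumptions for illustration. Standard AdaIN matches each channel's mean and standard deviation to the style's global statistics; a component-aware variant applies the same matching separately within each masked region (e.g. sky, road, vehicles), so each component can receive its own thermal style.

```python
import numpy as np

def adain(content, style, eps=1e-5):
    """Global AdaIN: align per-channel mean/std of the content
    features (C, H, W) to those of the style features."""
    c_mean = content.mean(axis=(1, 2), keepdims=True)
    c_std = content.std(axis=(1, 2), keepdims=True) + eps
    s_mean = style.mean(axis=(1, 2), keepdims=True)
    s_std = style.std(axis=(1, 2), keepdims=True) + eps
    return s_std * (content - c_mean) / c_std + s_mean

def coadain(content, style, content_mask, style_mask, eps=1e-5):
    """Component-aware AdaIN (illustrative sketch): apply AdaIN
    separately inside each labeled component, so statistics are
    matched per component rather than over the whole feature map.
    Assumes every label in content_mask also appears in style_mask."""
    out = np.empty_like(content)
    for comp in np.unique(content_mask):
        c_sel = content[:, content_mask == comp]   # (C, N_content)
        s_sel = style[:, style_mask == comp]       # (C, N_style)
        c_mean = c_sel.mean(axis=1, keepdims=True)
        c_std = c_sel.std(axis=1, keepdims=True) + eps
        s_mean = s_sel.mean(axis=1, keepdims=True)
        s_std = s_sel.std(axis=1, keepdims=True) + eps
        out[:, content_mask == comp] = (
            s_std * (c_sel - c_mean) / c_std + s_mean
        )
    return out
```

With a global style, a bright sky region and a dark road region would be shifted by the same statistics; the per-component loop lets each region inherit statistics from the matching region of the style image, which is the intuition behind the component-aware design.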