EPOFusion: Exposure-Aware Progressive Optimization Method for Infrared and Visible Image Fusion

📅 2026-03-17
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the critical issue of key information loss in overexposed regions prevalent in existing infrared and visible image fusion methods. To this end, we propose an exposure-aware progressive optimization fusion model that leverages a guidance module to extract fine-grained infrared features from overexposed areas, integrates them through a multi-scale context-aware iterative decoder to progressively enhance fusion quality, and employs an adaptive loss function to dynamically balance modality contributions under varying exposure conditions. We further introduce the first high-quality infrared-visible overexposure dataset (IVOE) to facilitate training and evaluation. Experimental results demonstrate that our method not only preserves visual fidelity in non-overexposed regions but also significantly improves the recovery of infrared cues in overexposed areas, outperforming state-of-the-art approaches and enhancing downstream task performance.

📝 Abstract
Overexposure frequently occurs in practical scenarios, causing the loss of critical visual information. However, existing infrared and visible fusion methods still exhibit unsatisfactory performance in highly bright regions. To address this, we propose EPOFusion, an exposure-aware fusion model. Specifically, a guidance module is introduced to help the encoder extract fine-grained infrared features from overexposed regions. Meanwhile, an iterative decoder incorporating a multiscale context fusion module is designed to progressively enhance the fused image, ensuring consistent details and superior visual quality. Finally, an adaptive loss function dynamically constrains the fusion process, enabling an effective balance between the modalities under varying exposure conditions. To achieve better exposure awareness, we construct the first infrared and visible overexposure dataset (IVOE), with high-quality infrared-guided annotations for overexposed regions. Extensive experiments show that EPOFusion outperforms existing methods: it preserves infrared cues in overexposed regions while achieving visually faithful fusion in non-overexposed areas, thereby enhancing both visual fidelity and downstream task performance. Code, fusion results, and the IVOE dataset will be made available at https://github.com/warren-wzw/EPOFusion.git.
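The abstract does not spell out how the adaptive loss weighs the two modalities, but the general idea (lean on infrared where the visible image is blown out, on visible everywhere else) can be sketched as an exposure-masked weighted loss. The following is a minimal illustrative sketch, not EPOFusion's actual formulation: the fixed luminance threshold, the binary mask, and the L1 terms are all assumptions (the paper instead uses learned modules and infrared-guided annotations from IVOE).

```python
import numpy as np

def overexposure_mask(visible, threshold=0.9):
    """Binary mask of overexposed pixels, estimated from visible-image
    luminance in [0, 1]. The fixed threshold is illustrative; IVOE uses
    infrared-guided annotations rather than a hard cutoff."""
    luminance = visible.mean(axis=-1) if visible.ndim == 3 else visible
    return (luminance >= threshold).astype(np.float32)

def exposure_adaptive_loss(fused, infrared, visible, threshold=0.9):
    """Pixel-wise L1 loss whose per-modality weight depends on exposure:
    overexposed pixels are constrained toward the infrared source, the
    rest toward the visible source. A hypothetical stand-in for the
    paper's adaptive loss."""
    mask = overexposure_mask(visible, threshold)
    vis_ref = visible.mean(axis=-1) if visible.ndim == 3 else visible
    loss_ir = np.abs(fused - infrared)    # penalize infrared mismatch
    loss_vis = np.abs(fused - vis_ref)    # penalize visible mismatch
    weighted = mask * loss_ir + (1.0 - mask) * loss_vis
    return float(weighted.mean())
```

A soft (continuous) mask, e.g. a sigmoid of luminance, would make the weighting differentiable end to end, which is likely closer in spirit to a trainable exposure-aware loss.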
Problem

Research questions and friction points this paper is trying to address.

infrared and visible image fusion
overexposure
highly bright regions
visual information loss
exposure awareness
Innovation

Methods, ideas, or system contributions that make the work stand out.

exposure-aware fusion
infrared-visible image fusion
overexposure handling
multiscale context fusion
adaptive loss function
Zhiwei Wang
College of Information Engineering, Zhejiang University of Technology, Hangzhou 310023, China
Yayu Zheng
College of Information Engineering, Zhejiang University of Technology, Hangzhou 310023, China
Defeng He
College of Information Engineering, Zhejiang University of Technology, Hangzhou 310023, China
Li Zhao
Zhejiang University
Xiaoqin Zhang
College of Information Engineering, Zhejiang University of Technology, Hangzhou 310023, China
Yuxing Li
Department of Electrical and Electronic Engineering, The University of Hong Kong, Hong Kong, China
Edmund Y. Lam
Department of Electrical and Electronic Engineering, The University of Hong Kong, Hong Kong, China