Texture-Aware StarGAN for CT data harmonization

📅 2025-03-19
📈 Citations: 0
Influential: 0
🤖 AI Summary
CT reconstruction kernel discrepancies induce texture inconsistencies that severely limit the generalization capability of deep learning models. To address this, we propose the first texture-aware StarGAN framework tailored for CT data harmonization, enabling one-to-many style transfer and standardization across reconstruction kernels. Our method introduces a multi-scale texture loss that jointly models spatial- and angular-domain texture features to explicitly suppress kernel-dependent artifacts. It further integrates multi-scale LPIPS constraints, adversarial loss, and cycle-consistency loss, implemented with a ResNet-based generator and a PatchGAN discriminator. Evaluated on 197 cases comprising 48,667 chest CT slices reconstructed with three distinct kernels, our approach achieves a 12.3% improvement in SSIM and a 28.7% reduction in FID. Radiologist assessments confirm statistically significant superiority over the baseline StarGAN. This work establishes a novel, interpretable, and robust paradigm for CT image standardization.
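The summary does not give the exact form of the multi-scale texture loss, but its description (texture features across spatial scales and angular directions, compared between the translated and target images) can be illustrated with a minimal NumPy sketch. This is an assumed stand-in using directional gradient magnitudes at several orientations and 2x-downsampled scales; the function and parameter names are hypothetical, not the paper's implementation:

```python
import numpy as np

def directional_gradients(img, angles):
    # Angular domain: gradient magnitude projected onto several orientations.
    gy, gx = np.gradient(img.astype(float))
    return [np.abs(np.cos(a) * gx + np.sin(a) * gy) for a in angles]

def downsample(img):
    # Spatial domain: 2x average pooling to move to a coarser scale.
    h, w = img.shape
    return img[: h // 2 * 2, : w // 2 * 2].reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

def multiscale_texture_loss(a, b, n_scales=3,
                            angles=(0.0, np.pi / 4, np.pi / 2, 3 * np.pi / 4)):
    # Mean L1 distance between directional texture maps, averaged over
    # orientations and spatial scales (an illustrative texture penalty).
    loss = 0.0
    for _ in range(n_scales):
        for ga, gb in zip(directional_gradients(a, angles),
                          directional_gradients(b, angles)):
            loss += np.mean(np.abs(ga - gb))
        a, b = downsample(a), downsample(b)
    return loss / (n_scales * len(angles))
```

In the full objective this term would be weighted and summed with the adversarial, cycle-consistency, and LPIPS losses mentioned above.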

📝 Abstract
Computed Tomography (CT) plays a pivotal role in medical diagnosis; however, variability across reconstruction kernels hinders data-driven approaches, such as deep learning models, from achieving reliable and generalized performance. To this end, CT data harmonization has emerged as a promising solution to minimize such non-biological variances by standardizing data across different sources or conditions. In this context, Generative Adversarial Networks (GANs) have proved to be a powerful framework for harmonization, framing it as a style-transfer problem. However, GAN-based approaches still face limitations in capturing complex relationships within the images, which are essential for effective harmonization. In this work, we propose a novel texture-aware StarGAN for CT data harmonization, enabling one-to-many translations across different reconstruction kernels. Although the StarGAN model has been successfully applied in other domains, its potential for CT data harmonization remains unexplored. Furthermore, our approach introduces a multi-scale texture loss function that embeds texture information across different spatial and angular scales into the harmonization process, effectively addressing kernel-induced texture variations. We conducted extensive experimentation on a publicly available dataset, utilizing a total of 48,667 chest CT slices from 197 patients distributed over three different reconstruction kernels, demonstrating the superiority of our method over the baseline StarGAN.
Problem

Research questions and friction points this paper is trying to address.

Address variability in CT reconstruction kernels
Enhance deep learning model generalization in medical imaging
Improve texture-aware harmonization across different CT data sources
Innovation

Methods, ideas, or system contributions that make the work stand out.

Texture-aware StarGAN for CT harmonization
Multi-scale texture loss function integration
One-to-many translations across reconstruction kernels
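The "one-to-many" translation in a StarGAN-style model is typically achieved by conditioning a single generator on a target-domain code rather than training one generator per kernel pair. A minimal NumPy sketch of that conditioning step follows; the paper does not specify its input encoding, so the one-hot channel-tiling shown here (and the function name) is an assumption borrowed from the original StarGAN design:

```python
import numpy as np

def condition_on_kernel(ct_slice, target_kernel, n_kernels=3):
    # ct_slice: (C, H, W) array. Tile a one-hot target-kernel code into
    # constant feature maps and stack them on the input channels, so one
    # generator can translate toward any of the reconstruction kernels.
    c, h, w = ct_slice.shape
    code = np.zeros((n_kernels, h, w), dtype=ct_slice.dtype)
    code[target_kernel] = 1.0
    return np.concatenate([ct_slice, code], axis=0)
```

The conditioned tensor (here with C + 3 channels for the three kernels in the dataset) would then be fed to the ResNet-based generator.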