Automated and Interpretable Survival Analysis from Multimodal Data

📅 2025-09-25
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address the challenges of multimodal data fusion and limited model interpretability in head-and-neck cancer survival analysis, this paper proposes MultiFIX—a deep learning framework that jointly encodes CT imaging and clinical variables. It integrates Grad-CAM to localize prognostically relevant lesion regions in CT scans and employs genetic programming to quantify the contribution of clinical features. Risk stratification is performed via an interpretable Cox regression module. MultiFIX achieves high predictive accuracy while ensuring transparent, clinically grounded decision-making. Evaluated on the RADCURE dataset, it attains a concordance index (C-index) of 0.838 for survival prediction and 0.826 for risk stratification—surpassing both clinical baselines and state-of-the-art academic models. Critically, the identified imaging biomarkers and clinical predictors strongly align with established prognostic markers, demonstrating clinical plausibility. MultiFIX thus provides a trustworthy, interpretable AI foundation for precision oncology.
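The transparent risk-stratification step described above can be illustrated with a minimal sketch: a linear Cox model assigns each patient a risk score exp(β·x), and patients are split into groups at the median score. This is a hypothetical illustration of the general technique, not the paper's implementation; the function names and the median cut-point are assumptions.

```python
import math

def cox_risk_scores(X, beta):
    """Linear Cox model: risk_i = exp(beta . x_i).
    beta would come from a fitted (transparent) Cox regression."""
    return [math.exp(sum(b * x for b, x in zip(beta, row))) for row in X]

def stratify_by_median(risks):
    """Split patients into high/low risk groups at the median score."""
    cut = sorted(risks)[len(risks) // 2]
    return ["high" if r >= cut else "low" for r in risks]

# Two patients, two clinical features, assumed coefficients:
scores = cox_risk_scores([[1.0, 0.0], [0.0, 1.0]], [1.0, -1.0])
groups = stratify_by_median(scores)  # ["high", "low"]
```

Because the model is linear in the input features, each coefficient β directly quantifies a feature's contribution to risk, which is what makes the Cox module interpretable.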

📝 Abstract
Accurate and interpretable survival analysis remains a core challenge in oncology. With growing multimodal data and the clinical need for transparent models to support validation and trust, this challenge increases in complexity. We propose an interpretable multimodal AI framework to automate survival analysis by integrating clinical variables and computed tomography imaging. Our MultiFIX-based framework uses deep learning to infer survival-relevant features that are further explained: imaging features are interpreted via Grad-CAM, while clinical variables are modeled as symbolic expressions through genetic programming. Risk estimation employs a transparent Cox regression, enabling stratification into groups with distinct survival outcomes. Using the open-source RADCURE dataset for head and neck cancer, MultiFIX achieves a C-index of 0.838 (prediction) and 0.826 (stratification), outperforming the clinical and academic baseline approaches and aligning with known prognostic markers. These results highlight the promise of interpretable multimodal AI for precision oncology with MultiFIX.
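The concordance index (C-index) reported above measures how often the model ranks patient pairs correctly: among comparable pairs, the patient who experiences the event earlier should have the higher predicted risk. A minimal pure-Python sketch of Harrell's C-index (an assumed illustration, not the paper's evaluation code):

```python
def concordance_index(times, events, risks):
    """Harrell's C-index: fraction of comparable pairs that the risk
    score orders correctly (higher risk -> earlier event).
    times: observed times; events: 1 = event, 0 = censored."""
    concordant, comparable = 0.0, 0
    n = len(times)
    for i in range(n):
        if not events[i]:
            continue  # a censored subject cannot anchor a pair
        for j in range(n):
            if times[i] < times[j]:  # pair (i, j) is comparable
                comparable += 1
                if risks[i] > risks[j]:
                    concordant += 1
                elif risks[i] == risks[j]:
                    concordant += 0.5  # ties count half
    return concordant / comparable

# Perfectly ranked toy data: earlier events get higher risk scores.
c = concordance_index([2, 4, 6], [1, 1, 0], [0.9, 0.5, 0.1])  # 1.0
```

A C-index of 0.5 corresponds to random ranking and 1.0 to perfect ranking, so the reported 0.838 indicates strong discriminative performance.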
Problem

Research questions and friction points this paper is trying to address.

Developing interpretable AI for multimodal survival analysis in oncology
Integrating clinical variables and CT imaging for transparent risk prediction
Automating feature extraction and stratification using explainable deep learning
Innovation

Methods, ideas, or system contributions that make the work stand out.

Integrates clinical and imaging data via multimodal AI
Uses Grad-CAM and genetic programming for feature interpretation
Employs transparent Cox regression for risk stratification
Mafalda Malafaia
Centrum Wiskunde & Informatica
eXplainable AI · Multimodality · AI for Health
Peter A. N. Bosman
Centrum Wiskunde & Informatica, Amsterdam, The Netherlands; Delft University of Technology, Delft, The Netherlands.
Coen Rasch
Leiden University Medical Center, Leiden, The Netherlands.
Tanja Alderliesten
Leiden University Medical Center (LUMC)
Radiation Oncology · Medical Image Processing · Artificial Intelligence