SAM2-Aug: Prior knowledge-based Augmentation for Target Volume Auto-Segmentation in Adaptive Radiation Therapy Using Segment Anything Model 2

๐Ÿ“… 2025-07-25
๐Ÿ“ˆ Citations: 0
โœจ Influential: 0
๐Ÿ“„ PDF
๐Ÿค– AI Summary
Manual tumor segmentation in adaptive radiotherapy (ART) is labor-intensive, time-consuming, and generalizes poorly; to address this, the paper proposes a prior-knowledge-enhanced SAM2 framework. Specifically, registered prior MR images and coarse annotations serve as contextual inputs; robustness to prompts is improved via randomized bounding-box expansion and morphological mask operations (erosion/dilation); and transfer learning with prompt-engineering fine-tuning is conducted across multi-center, multi-sequence MRI datasets. The method substantially improves SAM2's anatomical boundary delineation accuracy and its generalization across tumor types and imaging sequences. Evaluated on multi-center abdominal and brain MRI datasets, it achieves Dice scores of 0.86-0.90, outperforming state-of-the-art CNN-, Transformer-, and prompt-driven models. To the authors' knowledge, this is the first work to achieve high-accuracy, fully automated target segmentation with SAM2 in clinical ART workflows.

๐Ÿ“ Abstract
Purpose: Accurate tumor segmentation is vital for adaptive radiation therapy (ART) but remains time-consuming and user-dependent. Segment Anything Model 2 (SAM2) shows promise for prompt-based segmentation but struggles with tumor accuracy. We propose prior knowledge-based augmentation strategies to enhance SAM2 for ART. Methods: Two strategies were introduced to improve SAM2: (1) using prior MR images and annotations as contextual inputs, and (2) improving prompt robustness via random bounding box expansion and mask erosion/dilation. The resulting model, SAM2-Aug, was fine-tuned and tested on the One-Seq-Liver dataset (115 MRIs from 31 liver cancer patients), and evaluated without retraining on Mix-Seq-Abdomen (88 MRIs, 28 patients) and Mix-Seq-Brain (86 MRIs, 37 patients). Results: SAM2-Aug outperformed convolutional, transformer-based, and prompt-driven models across all datasets, achieving Dice scores of 0.86 (liver), 0.89 (abdomen), and 0.90 (brain). It demonstrated strong generalization across tumor types and imaging sequences, with improved performance in boundary-sensitive metrics. Conclusions: Incorporating prior images and enhancing prompt diversity significantly boosts segmentation accuracy and generalizability. SAM2-Aug offers a robust, efficient solution for tumor segmentation in ART. Code and models will be released at https://github.com/apple1986/SAM2-Aug.
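The abstract's second strategy (random bounding-box expansion and mask erosion/dilation to diversify prompts) can be sketched as follows. This is a minimal illustrative sketch, not the authors' released code: the function names, expansion range, and iteration counts are assumptions.

```python
# Hypothetical sketch of the prompt-augmentation strategies described in the
# abstract: random bounding-box expansion and random mask erosion/dilation.
# Parameters (max_expand, max_iter) are illustrative assumptions.
import numpy as np
from scipy import ndimage


def expand_bbox(bbox, img_shape, max_expand=10, rng=None):
    """Randomly pad each side of an (x0, y0, x1, y1) box, clipped to the image."""
    rng = rng or np.random.default_rng()
    x0, y0, x1, y1 = bbox
    h, w = img_shape
    pads = rng.integers(0, max_expand + 1, size=4)
    return (max(x0 - int(pads[0]), 0), max(y0 - int(pads[1]), 0),
            min(x1 + int(pads[2]), w), min(y1 + int(pads[3]), h))


def perturb_mask(mask, max_iter=3, rng=None):
    """Randomly erode or dilate a binary mask to diversify mask prompts."""
    rng = rng or np.random.default_rng()
    n = int(rng.integers(1, max_iter + 1))
    if rng.random() < 0.5:
        out = ndimage.binary_erosion(mask, iterations=n)
    else:
        out = ndimage.binary_dilation(mask, iterations=n)
    return out.astype(mask.dtype)
```

Applying such perturbations during fine-tuning exposes the model to imprecise boxes and masks, which is what makes the learned segmentation robust to sloppy prompts at inference time.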
Problem

Research questions and friction points this paper is trying to address.

Enhancing tumor segmentation accuracy in adaptive radiation therapy
Improving SAM2 model performance with prior knowledge augmentation
Addressing prompt robustness and generalization in medical image segmentation
Innovation

Methods, ideas, or system contributions that make the work stand out.

Prior MR images enhance SAM2 segmentation accuracy
Random bounding box expansion improves prompt robustness
Fine-tuned SAM2-Aug achieves high Dice scores
๐Ÿ”Ž Similar Papers
No similar papers found.
Guoping Xu
UTSW, WIT
Medical Image Segmentation · Disease Quantification · Computer Vision
Yan Dai
The Medical Artificial Intelligence and Automation (MAIA) Laboratory, Department of Radiation Oncology, University of Texas Southwestern Medical Center, Dallas, TX 75390, USA
Hengrui Zhao
The Medical Artificial Intelligence and Automation (MAIA) Laboratory, Department of Radiation Oncology, University of Texas Southwestern Medical Center, Dallas, TX 75390, USA
Ying Zhang
The Medical Artificial Intelligence and Automation (MAIA) Laboratory, Department of Radiation Oncology, University of Texas Southwestern Medical Center, Dallas, TX 75390, USA
Jie Deng
Professor, University of Pennsylvania
lymphedema · symptom management · cancer survivorship · oncology nursing
Weiguo Lu
The Medical Artificial Intelligence and Automation (MAIA) Laboratory, Department of Radiation Oncology, University of Texas Southwestern Medical Center, Dallas, TX 75390, USA
You Zhang
The Medical Artificial Intelligence and Automation (MAIA) Laboratory, Department of Radiation Oncology, University of Texas Southwestern Medical Center, Dallas, TX 75390, USA