🤖 AI Summary
To address the time-consuming intraoperative pathological diagnosis and expert dependency of Mohs micrographic surgery (MMS), this study proposes the first end-to-end deep learning framework tailored to Mohs frozen sections, jointly performing basal cell carcinoma (BCC) segmentation and artifact detection. Methodologically, it integrates U-Net-based semantic segmentation with an ensemble classifier: segmentation accuracy is enhanced via sliding-window tiling and Dice-loss optimization, and features derived from the segmentation masks are then fed into an AUC-optimized, multi-model ensemble to improve discriminative robustness. This is the first work to jointly model tumor (BCC) and non-tumor (including artifact) semantics on Mohs sections. Experiments report Dice scores of 0.70 (tumor) and 0.67 (non-tumor), with corresponding AUCs of 0.98 and 0.96; patch-level and whole-slide BCC detection achieve AUCs of 0.98 and 0.91, respectively, improving intraoperative decision-making efficiency and clinical accessibility.
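The two segmentation ingredients named above, sliding-window tiling and the Dice criterion, can be sketched in a few lines. This is a minimal illustrative implementation in plain NumPy, not the authors' code; the tile size, stride, and smoothing constant are assumptions for the example.

```python
import numpy as np

def sliding_window_tiles(image, tile_size, stride):
    """Yield square tiles from a 2-D image (or mask) with a sliding window.
    Tiles that would overrun the image border are skipped for simplicity."""
    h, w = image.shape[:2]
    for y in range(0, h - tile_size + 1, stride):
        for x in range(0, w - tile_size + 1, stride):
            yield image[y:y + tile_size, x:x + tile_size]

def dice_score(pred, target, eps=1e-7):
    """Dice coefficient between two binary masks.
    The Dice loss minimized during training is simply 1 - dice_score."""
    pred = pred.astype(bool)
    target = target.astype(bool)
    inter = np.logical_and(pred, target).sum()
    return (2.0 * inter + eps) / (pred.sum() + target.sum() + eps)
```

In a typical setup, each whole-slide image is decomposed into overlapping tiles, the U-Net is trained on tiles with `1 - dice_score` as (part of) the loss, and per-tile predictions are stitched back into a slide-level mask.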
📝 Abstract
Mohs micrographic surgery (MMS) is the gold-standard technique for removing high-risk nonmelanoma skin cancer; however, intraoperative histopathological examination demands significant time, effort, and expertise. The objective of this study is to develop a deep learning model to detect basal cell carcinoma (BCC) and artifacts on Mohs slides. A total of 731 Mohs slides from 51 patients with BCCs were used in this study: 91 containing tumor and 640 without tumor, which were defined as non-tumor. The dataset was employed to train U-Net-based models that segment tumor and non-tumor regions on the slides. The segmented patches were classified as tumor or non-tumor to produce predictions for whole-slide images (WSIs). For the segmentation phase, model performance was measured with Dice scores of 0.70 and 0.67 and area under the curve (AUC) scores of 0.98 and 0.96 for tumor and non-tumor, respectively. For tumor classification, AUCs of 0.98 for patch-based detection and 0.91 for slide-based detection were obtained on the test dataset. We present an AI system that can detect tumor and non-tumor regions on Mohs slides with high accuracy. Deep learning can aid Mohs surgeons and dermatopathologists in making more accurate decisions.
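The abstract reports both patch-based and slide-based AUCs, which implies some rule for aggregating patch-level predictions into a single score per WSI. The paper's exact aggregation rule is not stated here; the function below sketches one common heuristic, averaging the top-k most suspicious patches, purely as an illustration (`top_k` is an assumed parameter).

```python
def slide_score(patch_probs, top_k=5):
    """Aggregate patch-level tumor probabilities into one slide-level score
    by averaging the top-k highest patch probabilities.
    Illustrative heuristic only; the study's actual rule may differ."""
    if not patch_probs:
        return 0.0
    probs = sorted(patch_probs, reverse=True)
    k = min(top_k, len(probs))
    return sum(probs[:k]) / k
```

Averaging only the most suspicious patches makes the slide score robust to the many benign patches on a mostly tumor-free section, while a single confident tumor patch can still raise the slide-level prediction.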