🤖 AI Summary
Traditional fetal movement (FM) assessment suffers from subjectivity and low accuracy. To address this, we propose an automated FM recognition framework based on self-supervised contrastive representation learning. Our method introduces a dual-contrastive loss that jointly models spatial and temporal features, augmented by a task-specific sampling strategy to enable robust representation learning from variable-length ultrasound videos; discriminative performance is further enhanced via probabilistic fine-tuning. Evaluated on a clinical dataset of 92 subjects, each contributing a 30-minute ultrasound video recording, the model achieves 78.01% sensitivity and 81.60% AUROC, outperforming existing approaches. This work represents the first application of spatiotemporal dual-contrastive learning to fetal motion analysis, establishing a label-efficient paradigm for objective and interpretable FM assessment from unlabeled ultrasound videos, with clear translational potential for clinical practice.
📝 Abstract
Accurate fetal movement (FM) detection is essential for assessing prenatal health, as abnormal movement patterns can indicate underlying complications such as placental dysfunction or fetal distress. Traditional methods, including maternal perception and cardiotocography (CTG), suffer from subjectivity and limited accuracy. To address these challenges, we propose Contrastive Ultrasound Video Representation Learning (CURL), a novel self-supervised learning framework for FM detection from extended fetal ultrasound video recordings. Our approach leverages a dual-contrastive loss, incorporating both spatial and temporal contrastive learning, to learn robust motion representations. Additionally, we introduce a task-specific sampling strategy that ensures effective separation of movement and non-movement segments during self-supervised training, and a probabilistic fine-tuning approach that enables flexible inference on arbitrarily long ultrasound recordings. Evaluated on an in-house dataset of 92 subjects, each with a 30-minute ultrasound session, CURL achieves a sensitivity of 78.01% and an AUROC of 81.60%, demonstrating its potential for reliable and objective FM analysis. These results underscore the promise of self-supervised contrastive learning for fetal movement analysis, paving the way for improved prenatal monitoring and clinical decision-making.
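The exact form of CURL's dual-contrastive objective is not given in the abstract; a minimal sketch of one plausible formulation, a weighted sum of two InfoNCE-style terms over spatial and temporal embeddings, is shown below. All function names, the temperature, and the weighting are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def info_nce(anchors, positives, temperature=0.1):
    """InfoNCE-style loss: row i of `positives` is the positive for row i of
    `anchors`; every other row in the batch serves as a negative.
    (Illustrative sketch; not the paper's actual loss.)"""
    # L2-normalise so dot products become cosine similarities
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    p = positives / np.linalg.norm(positives, axis=1, keepdims=True)
    logits = a @ p.T / temperature                   # (N, N) similarity matrix
    logits -= logits.max(axis=1, keepdims=True)      # numerical stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))               # positives lie on the diagonal

def dual_contrastive_loss(spat_a, spat_p, temp_a, temp_p, weight=0.5):
    """Hypothetical dual objective: weighted sum of a spatial term
    (appearance across augmented views) and a temporal term
    (motion dynamics across clip segments)."""
    return (weight * info_nce(spat_a, spat_p)
            + (1.0 - weight) * info_nce(temp_a, temp_p))
```

Under this formulation, matched anchor/positive pairs should yield a much lower loss than randomly paired embeddings, which is the signal that drives representation learning.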