DUSTrack: Semi-automated point tracking in ultrasound videos

📅 2025-07-18
📈 Citations: 0 · Influential: 0
🤖 AI Summary
B-mode ultrasound video analysis faces challenges in tissue motion tracking due to speckle noise, low edge contrast, and out-of-plane motion, impeding reliable temporal tracking of anatomical landmarks. To address this, we propose a deep learning–optical flow fusion framework for semi-automatic tracking. Our method introduces a novel optical flow filter that suppresses high-frequency noise while preserving rapid motion features; incorporates a semi-supervised learning strategy with iterative model refinement to enhance generalizability and robustness; and integrates a graphical user interface enabling real-time interactive correction. The framework is validated across diverse applications—including myocardial wall deformation, skeletal muscle strain, and fascicle tracking—demonstrating superior accuracy over zero-shot approaches and performance comparable to task-specific models. An open-source implementation is publicly available, supporting both clinical assessment and biomechanical research.
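The filtering idea in the summary lends itself to a short sketch: gate temporal smoothing by local optical-flow magnitude, so frame-to-frame jitter is suppressed where tissue is nearly static while genuinely rapid motion passes through unsmoothed. The function below is a minimal illustration in Python/OpenCV, not DUSTrack's actual filter; the name `flow_gated_smooth`, the motion threshold, and the EMA weight are all illustrative assumptions.

```python
import numpy as np
import cv2

def flow_gated_smooth(frames, raw_pts, alpha=0.8, motion_thresh=2.0):
    """Smooth a tracked point trajectory only where optical flow says
    the tissue is nearly static.

    frames        : list of grayscale frames (H, W), uint8
    raw_pts       : (T, 2) array of raw tracked (x, y) positions
    alpha         : EMA weight applied during low-motion spans
    motion_thresh : flow magnitude (px/frame) above which smoothing is
                    bypassed so rapid tissue motion is preserved

    NOTE: a hedged sketch of the idea described above, not DUSTrack's
    actual filter; thresholds and weights are illustrative.
    """
    filtered = raw_pts.astype(np.float64).copy()
    for t in range(1, len(frames)):
        # Dense Farneback flow between consecutive frames
        flow = cv2.calcOpticalFlowFarneback(
            frames[t - 1], frames[t], None,
            pyr_scale=0.5, levels=3, winsize=15,
            iterations=3, poly_n=5, poly_sigma=1.2, flags=0)
        x, y = np.round(raw_pts[t]).astype(int)
        y = np.clip(y, 0, flow.shape[0] - 1)
        x = np.clip(x, 0, flow.shape[1] - 1)
        speed = np.linalg.norm(flow[y, x])  # local motion magnitude
        if speed < motion_thresh:
            # Low motion: treat frame-to-frame change as noise and smooth
            filtered[t] = alpha * filtered[t - 1] + (1 - alpha) * raw_pts[t]
        else:
            # Rapid motion: keep the raw estimate to avoid smoothing lag
            filtered[t] = raw_pts[t]
    return filtered
```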

📝 Abstract
Ultrasound technology enables safe, non-invasive imaging of dynamic tissue behavior, making it a valuable tool in medicine, biomechanics, and sports science. However, accurately tracking tissue motion in B-mode ultrasound remains challenging due to speckle noise, low edge contrast, and out-of-plane movement. These challenges complicate the task of tracking anatomical landmarks over time, which is essential for quantifying tissue dynamics in many clinical and research applications. This manuscript introduces DUSTrack (Deep learning and optical flow-based toolkit for UltraSound Tracking), a semi-automated framework for tracking arbitrary points in B-mode ultrasound videos. We combine deep learning with optical flow to deliver high-quality and robust tracking across diverse anatomical structures and motion patterns. The toolkit includes a graphical user interface that streamlines the generation of high-quality training data and supports iterative model refinement. It also implements a novel optical-flow-based filtering technique that reduces high-frequency frame-to-frame noise while preserving rapid tissue motion. DUSTrack demonstrates superior accuracy compared to contemporary zero-shot point trackers and performs on par with specialized methods, establishing its potential as a general and foundational tool for clinical and biomechanical research. We demonstrate DUSTrack's versatility through three use cases: cardiac wall motion tracking in echocardiograms, muscle deformation analysis during reaching tasks, and fascicle tracking during ankle plantarflexion. As an open-source solution, DUSTrack offers a powerful, flexible framework for point tracking to quantify tissue motion from ultrasound videos. DUSTrack is available at https://github.com/praneethnamburi/DUSTrack.
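One way to picture the abstract's iterative model refinement is as a train–predict–correct loop. The sketch below uses hypothetical callbacks (`train`, `predict`, `correct`) as stand-ins for DUSTrack's GUI-driven steps; the toolkit's real API is not reproduced here.

```python
from typing import Callable, Dict, List, Tuple

Point = Tuple[float, float]
Labels = Dict[int, List[Point]]  # frame index -> keypoint coordinates

def refine_iteratively(
    train: Callable[[Labels], object],
    predict: Callable[[object], Labels],
    correct: Callable[[Labels], Labels],
    seed_labels: Labels,
    rounds: int = 3,
):
    """Human-in-the-loop refinement cycle sketched from the abstract.

    `train`, `predict`, and `correct` are caller-supplied callbacks
    (hypothetical names): fit a tracker on labeled frames, run it over
    the videos, and return the frames a human fixed in the interface.
    """
    labels: Labels = dict(seed_labels)
    model = train(labels)                    # fit on seed annotations
    for _ in range(rounds):
        predictions = predict(model)
        corrections = correct(predictions)   # GUI surfaces bad frames
        labels.update(corrections)           # corrected frames become
        model = train(labels)                # new training data
    return model
```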
Problem

Research questions and friction points this paper is trying to address.

Track tissue motion in B-mode ultrasound despite speckle noise
Overcome low edge contrast and out-of-plane motion
Provide accurate anatomical landmark tracking for clinical and biomechanical research
Innovation

Methods, ideas, or system contributions that make the work stand out.

Combines deep learning with optical flow (see the fusion sketch after this list)
Includes a GUI for generating training data and correcting tracks
Applies an optical-flow-based filter that removes frame-to-frame noise while preserving rapid motion
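As a rough illustration of the first bullet, the sketch below propagates the previous frame's points with pyramidal Lucas-Kanade optical flow and blends them with a network's per-frame predictions. The weighted-average fusion rule and the `w_model` parameter are assumptions; the paper's exact fusion scheme is not given here.

```python
import numpy as np
import cv2

def fuse_tracks(prev_frame, frame, prev_pts, model_pts, w_model=0.5):
    """Fuse optical-flow propagation with a deep model's keypoints.

    prev_pts  : (N, 2) points tracked in the previous frame
    model_pts : (N, 2) keypoints predicted by the deep model on `frame`
    w_model   : blend weight given to the model prediction (assumed)
    """
    prev_pts = np.asarray(prev_pts, dtype=np.float32)
    model_pts = np.asarray(model_pts, dtype=np.float32)
    # Propagate last frame's points with pyramidal Lucas-Kanade flow
    flow_pts, status, _ = cv2.calcOpticalFlowPyrLK(
        prev_frame, frame, prev_pts.reshape(-1, 1, 2), None,
        winSize=(21, 21), maxLevel=3)
    flow_pts = flow_pts.reshape(-1, 2)
    ok = status.reshape(-1).astype(bool)
    # Simple confidence-free blend of the two estimates
    fused = w_model * model_pts + (1 - w_model) * flow_pts
    # Where flow tracking failed, fall back to the model prediction
    fused[~ok] = model_pts[~ok]
    return fused.astype(np.float32)
```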
Praneeth Namburi
MIT.nano Immersion Lab, MIT
Neuroscience · Biomechanics
Roger Pallarès-López
Department of Mechanical Engineering, MIT, Cambridge, MA 02139, USA
Jessica Rosendorf
Department of Mechanical Engineering, MIT, Cambridge, MA 02139, USA
Duarte Folgado
Fraunhofer Portugal
Time Series · Signal Processing · Machine Learning · Explainable AI · Biomedical Engineering
Brian W. Anthony
Institute for Medical Engineering and Science; MIT.nano Immersion Lab; Department of Mechanical Engineering, MIT, Cambridge, MA 02139, USA