Real-Time Threaded Houbara Detection and Segmentation for Wildlife Conservation using Mobile Platforms

📅 2025-10-03
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
To address the challenge of real-time detection and segmentation of the conservation-priority Houbara Bustard in natural habitats—where its cryptic, highly camouflaged appearance and constrained on-device computational resources hinder practical deployment—this paper proposes a mobile-oriented two-stage lightweight framework. It employs YOLOv10 for high-accuracy detection and MobileSAM for fine-grained segmentation, integrated via a novel multi-threaded parallel inference mechanism (Threading Detection Model, TDM) that runs both stages concurrently. Evaluated on a custom dataset of 40,000 images, the framework achieves a detection mAP50 of 0.9627 and a segmentation mIoU of 0.7421, with YOLOv10 detection running at 43.7 ms per frame. To the authors' knowledge, this is the first work to enable efficient, real-time, end-to-end perception of a highly camouflaged species on mobile platforms. The proposed solution provides a deployable, lightweight foundation for intelligent monitoring of endangered wildlife in field applications.

📝 Abstract
Real-time animal detection and segmentation in natural environments are vital for wildlife conservation, enabling non-invasive monitoring through remote camera streams. However, these tasks remain challenging due to limited computational resources and the cryptic appearance of many species. We propose a mobile-optimized two-stage deep learning framework that integrates a Threading Detection Model (TDM) to parallelize YOLOv10-based detection and MobileSAM-based segmentation. Unlike prior YOLO+SAM pipelines, our approach improves real-time performance by reducing latency through threading. YOLOv10 handles detection while MobileSAM performs lightweight segmentation, both executed concurrently for efficient resource use. On the cryptic Houbara Bustard, a conservation-priority species, our model achieves mAP50 of 0.9627, mAP75 of 0.7731, mAP95 of 0.7178, and a MobileSAM mIoU of 0.7421. YOLOv10 operates at 43.7 ms per frame, confirming real-time readiness. We introduce a curated Houbara dataset of 40,000 annotated images to support model training and evaluation across diverse conditions. The code and dataset used in this study are publicly available on GitHub at https://github.com/LyesSaadSaoud/mobile-houbara-detseg. For interactive demos and additional resources, visit https://lyessaadsaoud.github.io/LyesSaadSaoud-Threaded-YOLO-SAM-Houbara.
Problem

Research questions and friction points this paper is trying to address.

Real-time detection of cryptic Houbara Bustard in natural environments
Mobile-optimized animal segmentation with limited computational resources
Parallelizing detection and segmentation for wildlife conservation monitoring
Innovation

Methods, ideas, or system contributions that make the work stand out.

Mobile-optimized two-stage deep learning framework
Threading Detection Model parallelizes YOLOv10 and MobileSAM
Concurrent execution reduces latency for real-time performance
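The threading idea above—detection and segmentation running concurrently and handing work over through a shared queue—can be sketched as below. This is a minimal illustration, not the paper's actual implementation: `detect` and `segment` are placeholder stubs standing in for the real YOLOv10 and MobileSAM calls, and the queue-plus-sentinel pattern is one common way to realize such a producer/consumer split.

```python
import queue
import threading

# Placeholder model calls: stand-ins for YOLOv10 detection and
# MobileSAM segmentation (the real models are not bundled here).
def detect(frame):
    # Would return YOLOv10 bounding boxes for the frame.
    return [(10, 10, 50, 50)]

def segment(frame, boxes):
    # Would return a MobileSAM mask per detected box.
    return [("mask", frame, b) for b in boxes]

def run_pipeline(frames):
    """Run detection and segmentation in two concurrent threads,
    passing detections through a bounded queue."""
    handoff = queue.Queue(maxsize=4)  # bounded queue caps memory use
    results = []

    def detector():
        for frame in frames:
            handoff.put((frame, detect(frame)))
        handoff.put(None)  # sentinel: no more frames

    def segmenter():
        while True:
            item = handoff.get()
            if item is None:
                break
            frame, boxes = item
            results.append(segment(frame, boxes))

    t_det = threading.Thread(target=detector)
    t_seg = threading.Thread(target=segmenter)
    t_det.start()
    t_seg.start()
    t_det.join()
    t_seg.join()
    return results

masks = run_pipeline(range(8))
print(len(masks))  # one mask list per input frame
```

With real models, the detector thread can start on frame *t+1* while the segmenter is still processing frame *t*, which is where the latency reduction over a strictly sequential YOLO+SAM pipeline comes from.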
Lyes Saad Saoud
Khalifa University Center for Autonomous Robotic Systems (KUCARS), Khalifa University, Abu Dhabi, United Arab Emirates.
Loic Lesobre
RENECO International Wildlife Consultants LLC, Abu Dhabi, United Arab Emirates.
Enrico Sorato
RENECO International Wildlife Consultants LLC, Abu Dhabi, United Arab Emirates.
Irfan Hussain
Assistant Professor, Khalifa University.
Grasping, Mechatronics, Rehabilitation, Prosthesis