🤖 AI Summary
To address the challenge of real-time weed detection in agricultural fields under the stringent resource constraints of edge devices, this paper proposes a lightweight YOLO-based detection framework, EcoWeedNet, optimized for the NVIDIA Jetson Orin Nano. While preserving core architectural elements (residual connections, attention mechanisms, and Cross-Stage Partial (CSP) structures), the framework integrates structured channel pruning, quantization-aware training (QAT), and TensorRT-based inference optimization. The resulting model achieves a 68.5% reduction in parameters and a 3.2 GFLOPs reduction in computational cost. Deployed at FP16 precision, it reaches an inference speed of 184 FPS with an mAP50 of 85.9%, significantly outperforming YOLO11n and YOLO12n. The work thus delivers a balanced accuracy-efficiency trade-off and a deployable end-to-end visual perception solution for precision agriculture on resource-constrained edge platforms.
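The structured channel pruning mentioned above can be sketched in PyTorch. This is a minimal illustration on a single hypothetical convolution layer, not the paper's actual pruning pipeline: whole output channels with the smallest L2 norm are zeroed, approximating the reported ~39.5% pruning ratio with `amount=0.4`.

```python
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

# Hypothetical stand-in for one EcoWeedNet conv block; the real
# architecture (with CSP blocks and attention) is not reproduced here.
conv = nn.Conv2d(in_channels=32, out_channels=64, kernel_size=3, padding=1)

# Structured pruning along dim=0 removes entire output channels,
# ranked by L2 norm (n=2); amount=0.4 zeroes ~40% of the channels.
prune.ln_structured(conv, name="weight", amount=0.4, n=2, dim=0)
prune.remove(conv, "weight")  # bake the zeroed weights in permanently

# Count output channels whose weights are now entirely zero.
with torch.no_grad():
    zeroed = (conv.weight.abs().sum(dim=(1, 2, 3)) == 0).sum().item()
print(f"pruned {zeroed}/{conv.out_channels} output channels")
```

In practice the zeroed channels would then be physically removed (along with the matching input channels of downstream layers) to realize the parameter and GFLOPs savings, which is the hard part for architectures with shortcuts and concatenations.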
📝 Abstract
Deploying deep learning models in agriculture is difficult because edge devices offer limited compute and memory. This work presents a compressed version of EcoWeedNet obtained through structured channel pruning, quantization-aware training (QAT), and acceleration with NVIDIA's TensorRT on the Jetson Orin Nano. Although architectures with residual shortcuts, attention mechanisms, concatenations, and CSP blocks are challenging to prune, the model size was reduced by up to 68.5% and computation by 3.2 GFLOPs, while inference speed reached 184 FPS at FP16, 28.7% faster than the baseline. On the CottonWeedDet12 dataset, the pruned EcoWeedNet at a 39.5% pruning ratio outperformed YOLO11n and YOLO12n (pruned by only 20%), achieving 83.7% precision, 77.5% recall, and 85.9% mAP50, demonstrating that it is both efficient and effective for precision agriculture.
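The QAT step can be illustrated with PyTorch's eager-mode quantization API. The tiny module below is a hypothetical stand-in for a detector head, not EcoWeedNet itself: fake-quantization observers are inserted during training so the network learns weights that survive int8 conversion.

```python
import torch
import torch.nn as nn
from torch.ao.quantization import (
    QuantStub, DeQuantStub, get_default_qat_qconfig, prepare_qat, convert,
)

class TinyDetectorHead(nn.Module):
    """Hypothetical miniature module, not the actual EcoWeedNet head."""
    def __init__(self):
        super().__init__()
        self.quant = QuantStub()      # fake-quantizes the float input
        self.conv = nn.Conv2d(3, 8, kernel_size=3, padding=1)
        self.relu = nn.ReLU()
        self.dequant = DeQuantStub()  # returns to float at the output

    def forward(self, x):
        return self.dequant(self.relu(self.conv(self.quant(x))))

model = TinyDetectorHead().train()
model.qconfig = get_default_qat_qconfig("fbgemm")
prepare_qat(model, inplace=True)  # insert fake-quant observers

# One "training" step so the observers record activation ranges;
# a real run would fine-tune on CottonWeedDet12 here.
out = model(torch.randn(1, 3, 32, 32))
out.sum().backward()

model.eval()
quantized = convert(model)  # fold observers into true int8 modules
print(type(quantized.conv).__module__)
```

After QAT, the paper deploys via TensorRT at FP16; conceptually that is a separate export step (e.g. ONNX export followed by TensorRT engine building on the Jetson), which is omitted here because it requires the target hardware.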