🤖 AI Summary
Accurate estimation of cover crop (CC) aboveground biomass (AGB) and its spatial distribution is critical for identifying weed-suppression gaps and enabling variable-rate management. This paper proposes a multimodal sensing and machine learning fusion framework deployed on a ground-based robotic platform, which simultaneously acquires optical imagery and LiDAR-derived 3D point cloud data to build an AGB regression model. Its key contribution is a field-deployed implementation of cross-modal collaborative modeling, jointly leveraging optical and geometric features, to improve the robustness of AGB retrieval under complex canopy structures. The best model achieves an R² of 0.88 and maintains consistent predictive performance across multiple fields and phenological stages. The resulting high-resolution AGB spatial maps enable precise identification of high-weed-risk zones and support targeted interventions, advancing CC management toward automated and sustainable practices.
📝 Abstract
Effective weed management is essential for mitigating significant crop yield losses in agricultural systems. Integrating cover crops (CC) offers multiple benefits, including reduced soil erosion, weed suppression, decreased nitrogen requirements, and enhanced carbon sequestration, all of which are closely tied to the aboveground biomass (AGB) they produce. However, biomass production varies considerably due to microsite variability, making accurate estimation and mapping essential for identifying zones of poor weed suppression and optimizing targeted management strategies. A comprehensive CC map that includes AGB distribution would enable informed decisions about weed control methods and optimal application rates. Manual visual inspection is impractical and labor-intensive, especially given extensive field sizes and the wide diversity of weed species and sizes. In this context, optical imagery and Light Detection and Ranging (LiDAR) data are two prominent sources whose complementary characteristics enhance AGB estimation. This study introduces a ground robot-mounted multimodal sensor system designed for agricultural field mapping. The system fuses optical and LiDAR data using machine learning (ML) methods to improve biomass predictions. The best ML-based model for dry AGB estimation achieved a coefficient of determination (R²) of 0.88, demonstrating robust performance across diverse field conditions. This approach offers valuable insights for site-specific management, enabling precise weed suppression strategies and promoting sustainable farming practices.
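The ML-based fusion described above can be sketched as a feature-level concatenation of optical and geometric descriptors feeding a single regressor. Everything below is illustrative: the feature names (NDVI, canopy cover, LiDAR canopy height, volume proxy), the synthetic AGB values, and the random-forest choice are assumptions for demonstration, not the paper's actual pipeline or data.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 300  # hypothetical number of sampled plots

# Hypothetical optical features per plot: mean NDVI and canopy cover fraction
ndvi = rng.uniform(0.2, 0.9, n)
cover = rng.uniform(0.1, 1.0, n)

# Hypothetical LiDAR features: mean canopy height (m) and a height*cover volume proxy
height = rng.uniform(0.05, 0.6, n)
volume = height * cover * rng.uniform(0.8, 1.2, n)

# Synthetic dry AGB target (kg/ha): driven by both optical and geometric structure
agb = 1500 * volume + 800 * ndvi * cover + rng.normal(0, 50, n)

# Feature-level fusion: concatenate both modalities into one design matrix
X = np.column_stack([ndvi, cover, height, volume])
X_train, X_test, y_train, y_test = train_test_split(X, agb, random_state=0)

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)
r2 = r2_score(y_test, model.predict(X_test))
print(f"R2 on held-out plots: {r2:.2f}")
```

On real field data, the same pattern would use features extracted from registered optical imagery and LiDAR point clouds, with held-out fields or phenological stages for validation rather than a random split.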