Robotic Multimodal Data Acquisition for In-Field Deep Learning Estimation of Cover Crop Biomass

📅 2025-06-27
🤖 AI Summary
Accurate estimation of cover crop (CC) aboveground biomass (AGB) and its spatial distribution is critical for identifying weed-suppression gaps and enabling variable-rate management. This paper proposes a multimodal sensing and deep learning fusion framework deployed on a ground-based robotic platform, which simultaneously acquires optical imagery and LiDAR-derived 3D point cloud data to construct an end-to-end AGB regression model. Its key innovation lies in the first field-deployed, dynamic-scenario implementation of cross-modal collaborative modeling—jointly leveraging optical and geometric features—to significantly enhance robustness in AGB retrieval under complex canopy structures. The model achieves an R² of 0.88 and maintains consistent predictive performance across multiple fields and phenological stages. The resulting centimeter-resolution AGB spatial maps enable precise identification of high-weed-risk zones and support targeted interventions, advancing CC management toward high-resolution, automated, and sustainable practices.

📝 Abstract
Accurate weed management is essential for mitigating significant crop yield losses, necessitating effective weed suppression strategies in agricultural systems. Integrating cover crops (CC) offers multiple benefits, including soil erosion reduction, weed suppression, decreased nitrogen requirements, and enhanced carbon sequestration, all of which are closely tied to the aboveground biomass (AGB) they produce. However, biomass production varies significantly due to microsite variability, making accurate estimation and mapping essential for identifying zones of poor weed suppression and optimizing targeted management strategies. To address this challenge, developing a comprehensive CC map, including its AGB distribution, will enable informed decision-making regarding weed control methods and optimal application rates. Manual visual inspection is impractical and labor-intensive, especially given the extensive field size and the wide diversity and variation of weed species and sizes. In this context, optical imagery and Light Detection and Ranging (LiDAR) data are two prominent sources with unique characteristics that enhance AGB estimation. This study introduces a ground robot-mounted multimodal sensor system designed for agricultural field mapping. The system integrates optical and LiDAR data, leveraging machine learning (ML) methods for data fusion to improve biomass predictions. The best ML-based model for dry AGB estimation achieved a coefficient of determination value of 0.88, demonstrating robust performance in diverse field conditions. This approach offers valuable insights for site-specific management, enabling precise weed suppression strategies and promoting sustainable farming practices.
Problem

Research questions and friction points this paper is trying to address.

Estimating cover crop biomass accurately for weed management
Mapping biomass distribution to optimize agricultural strategies
Fusing optical and LiDAR data with machine learning
Innovation

Methods, ideas, or system contributions that make the work stand out.

Robot-mounted multimodal sensor system
Optical and LiDAR data fusion
Machine learning for biomass estimation
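
To make the fusion idea concrete, here is a minimal, illustrative sketch of feature-level fusion of optical and LiDAR-derived features for dry AGB regression. All data below are synthetic, and ordinary least squares stands in for the paper's ML regressors; the feature names (mean NDVI, canopy cover, canopy height statistics) are plausible assumptions, not the paper's actual feature set.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200  # number of sampled plots (synthetic)

# Hypothetical per-plot optical features: mean NDVI, canopy cover fraction
optical = rng.uniform(0.1, 0.9, size=(n, 2))
# Hypothetical per-plot LiDAR features: mean canopy height (m), height std (m)
lidar = rng.uniform(0.05, 0.6, size=(n, 2))

# Synthetic dry AGB (kg/ha) driven by both modalities plus measurement noise
agb = 2000 * optical[:, 0] + 1500 * lidar[:, 0] + rng.normal(0, 100, n)

# Early (feature-level) fusion: concatenate the two modalities' features
X = np.hstack([optical, lidar, np.ones((n, 1))])  # trailing bias column

# Fit a linear model by least squares and score with R^2
coef, *_ = np.linalg.lstsq(X, agb, rcond=None)
pred = X @ coef
ss_res = np.sum((agb - pred) ** 2)
ss_tot = np.sum((agb - np.mean(agb)) ** 2)
r2 = 1 - ss_res / ss_tot
print(f"R^2 = {r2:.2f}")
```

The key design choice illustrated is early fusion: each modality contributes a feature vector, and a single regressor is trained on their concatenation, so the model can weight geometric (LiDAR) against spectral (optical) cues jointly rather than averaging two separate predictions.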
Joe Johnson
Researcher, Texas A&M University
Field Robotics, Robotic Perception, Precision Agriculture, Machine Learning
Phanender Chalasani
Department of Computer Science and Engineering, Texas A&M University, College Station, TX 77840, USA
Arnav Shah
University of Toronto, Vector Institute
deep learning, drug discovery
Ram L. Ray
College of Agriculture, Food and Natural Resources, Prairie View A&M University, Prairie View, TX 77446, USA
Muthukumar Bagavathiannan
Department of Soil and Crop Sciences, Texas A&M University, College Station, TX 77840, USA