🤖 AI Summary
Traditional methods struggle to accurately and cost-effectively estimate aboveground biomass (AGB) in dense forests. To address this, we propose the first end-to-end deep learning framework for dense AGB prediction from a single ground-level RGB image. Our key innovation is formulating pixel-wise AGB density maps as the regression target. We leverage SPREAD, a large-scale synthetic 3D forest dataset, using its instance segmentation masks and tree attributes (e.g., species, diameter at breast height) to generate pixel-level AGB ground truth via allometric equations. Critically, our method requires neither LiDAR nor multi-view imagery, greatly enhancing scalability and enabling broad public participation. Experiments demonstrate strong generalization: the median absolute error is 1.22 kg/m² on the held-out SPREAD test set and 1.94 kg/m² on real-world forest images, confirming both accuracy and practical applicability.
📝 Abstract
Forests play a critical role in global ecosystems by supporting biodiversity and mitigating climate change via carbon sequestration. Accurate aboveground biomass (AGB) estimation is essential for assessing carbon storage and wildfire fuel loads, yet traditional methods rely on labor-intensive field measurements or remote sensing approaches with significant limitations in dense vegetation. In this work, we propose a novel learning-based method for estimating AGB from a single ground-based RGB image. We frame this as a dense prediction task, introducing AGB density maps, where each pixel represents tree biomass normalized by the plot area and each tree's image area. We leverage the recently introduced synthetic 3D SPREAD dataset, which provides realistic forest scenes with per-image tree attributes (height, trunk and canopy diameter) and instance segmentation masks. Using these assets, we compute AGB via allometric equations and train a model to predict AGB density maps, integrating them to recover the AGB estimate for the captured scene. Our approach achieves a median AGB estimation error of 1.22 kg/m² on held-out SPREAD data and 1.94 kg/m² on a real-image dataset. To our knowledge, this is the first method to estimate aboveground biomass directly from a single RGB image, opening up the possibility for a scalable, interpretable, and cost-effective solution for forest monitoring, while also enabling broader participation through citizen science initiatives.
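The density-map construction described in the abstract can be sketched concretely. In the toy code below, each pixel belonging to tree *i* carries AGB_i / (plot area × pixel count of tree *i*), so summing (integrating) the map recovers the scene's AGB per unit ground area. The power-law allometry and its coefficients are illustrative placeholders, not the paper's equations, and the helper names (`allometric_agb`, `agb_density_map`) are hypothetical:

```python
import numpy as np

def allometric_agb(dbh_cm, a=0.0673, b=2.9):
    """Generic power-law allometry AGB = a * DBH^b (kg).
    Coefficients are illustrative only; real allometric equations
    are species- and region-specific."""
    return a * dbh_cm ** b

def agb_density_map(instance_masks, dbh_values, plot_area_m2):
    """Build a pixel-wise AGB density map from instance masks.
    Each pixel of tree i gets AGB_i / (plot_area * n_pixels_i),
    so the map's sum equals total AGB / plot area (kg/m^2)."""
    h, w = instance_masks[0].shape
    density = np.zeros((h, w), dtype=np.float64)
    for mask, dbh in zip(instance_masks, dbh_values):
        n_pix = mask.sum()
        if n_pix == 0:
            continue  # tree fully occluded in this view
        density[mask] = allometric_agb(dbh) / (plot_area_m2 * n_pix)
    return density

# Toy scene: two non-overlapping trees in a 4x4 image, 100 m^2 plot.
m1 = np.zeros((4, 4), dtype=bool); m1[:2, :2] = True
m2 = np.zeros((4, 4), dtype=bool); m2[2:, 2:] = True
dmap = agb_density_map([m1, m2], [20.0, 30.0], plot_area_m2=100.0)
total_density = dmap.sum()  # kg/m^2: (AGB_1 + AGB_2) / plot_area
```

A model trained to regress `dmap` from the RGB image can then be "integrated" at inference time with a single sum over pixels, which is what makes the per-pixel formulation interpretable: each tree's contribution to the final estimate is localized in the image.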