🤖 AI Summary
This study addresses the challenge of monitoring individual-level plant phenology and plant–animal interactions in tropical ecosystems. Deploying low-cost, animal-triggered camera traps in a Hawaiian cloud forest, the work combines vision foundation models with traditional computer vision techniques to extract high-temporal-resolution, individual-scale phenological data without supervised learning, while simultaneously capturing plant–animal interaction events. By overcoming the coarse temporal grain of conventional sampling, the method reveals finer phenological dynamics and points to potential drivers linking plant phenophases to animal visitation behavior. The framework offers a scalable paradigm for monitoring tropical ecosystems through automated, continuous visual observation.
📝 Abstract
Plant phenology, the timing of cyclical events such as leafing out, flowering, or fruiting, has wide ecological impacts but is broadly understudied, especially in the tropics. Image analysis has greatly enhanced remote phenological monitoring, yet capturing phenology at the individual level remains challenging. In this project, we deployed low-cost, animal-triggered camera traps at the Pu'u Maka'ala Natural Area Reserve in Hawaii to simultaneously document shifts in plant phenology and plant–animal interactions. Using a combination of foundation vision models and traditional computer vision methods, we measure phenological trends from images that are comparable to on-the-ground observations, without relying on supervised learning. These temporally fine-grained phenology measurements from camera-trap images uncover trends that coarser traditional sampling fails to detect. When combined with detailed visitation data detected in the same images, these trends can begin to elucidate drivers of both plant phenology and animal ecology.
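The abstract does not detail the image-analysis pipeline, but the "traditional computer vision" side of such a system can be illustrated with a simple color-threshold proxy for flowering intensity. The sketch below is purely illustrative and is not the study's method: it assumes a red-blossomed canopy (as with ʻōhiʻa lehua) and uses hypothetical channel thresholds; the function name `flowering_fraction` and all cutoff values are invented for this example.

```python
import numpy as np

def flowering_fraction(rgb, r_min=150, gap=60):
    """Fraction of pixels falling in a crude 'red flower' color range.

    A hypothetical proxy for per-frame flowering intensity: a pixel counts
    as 'flower' when its red channel is strong and clearly dominates both
    green and blue. Thresholds are illustrative, not from the study.
    """
    r = rgb[..., 0].astype(int)
    g = rgb[..., 1].astype(int)
    b = rgb[..., 2].astype(int)
    mask = (r >= r_min) & (r - g >= gap) & (r - b >= gap)
    return float(mask.mean())

# Synthetic demo: a mostly green canopy frame with one patch of red blossoms.
frame = np.zeros((100, 100, 3), dtype=np.uint8)
frame[..., 1] = 120                    # green canopy background
frame[20:30, 40:60] = (220, 40, 30)    # 200 "flower" pixels of 10,000 total
print(flowering_fraction(frame))       # → 0.02
```

Applied to each image in a camera-trap sequence, a statistic like this yields a daily time series of flowering intensity per individual plant, which can then be compared against animal visitation events detected in the same frames.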