For farmers, policymakers, and government agencies, accurately defining agricultural crop phenology and its spatio-temporal variability is critical. Two approaches are currently used to report crop phenology: land surface phenology describes the overall trend, while weekly USDA-NASS reports describe the development of particular crops at the regional level. High-cadence Earth observations can improve the accuracy of these estimates and bring crop phenology classifications closer to the precision farmers demand. The second component of the proposed solution is a robust classifier (e.g., random forest, RF) capable of handling large data sets. To evaluate this solution, this study compared the output of an RF classifier model using weather, two satellite sources (Planet Fusion, PF, and Sentinel-2, S-2), and ground truth data to improve maize (Zea mays L.) crop phenology classification, with two regions of Kansas (Southwest and Central) as a testbed during the 2017 growing season. Our findings suggest that high temporal resolution (PF) data can significantly improve crop classification metrics (f1-score = 0.94) relative to S-2 (f1-score = 0.86). Additionally, the f1-score declined to between 0.74 and 0.60 when we assessed the ability of S-2 to extend the temporal forecast of crop phenology. This research highlights the critical role of very high temporal resolution (daily) Earth observation data for crop monitoring and decision making in agriculture.
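As an illustration only, a minimal sketch of this kind of classification setup, assuming scikit-learn and a purely hypothetical tabular feature set (rows are field-date observations with vegetation-index and weather features; labels are phenology stages); it is not the authors' exact pipeline:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import f1_score

# Hypothetical feature table: each row is one field-date observation with
# daily VI values and weather covariates; y holds phenology-stage labels.
rng = np.random.default_rng(0)
X = rng.random((500, 12))          # placeholder VI time-series + weather features
y = rng.integers(0, 5, size=500)   # placeholder stage labels (e.g., VE..R6)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=42
)

clf = RandomForestClassifier(n_estimators=500, random_state=42)
clf.fit(X_train, y_train)

# A weighted F1 summarizes per-class performance, the metric quoted in the abstract.
print("f1-score:", f1_score(y_test, clf.predict(X_test), average="weighted"))
```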
An integrated approach of field, weather, and satellite data for monitoring maize phenology
Abstract: Efficient, more accurate reporting of maize (Zea mays L.) phenology, crop condition, and progress is crucial for agronomists and policy makers. Integration of satellite imagery with machine learning models has shown great potential to improve crop classification and facilitate in-season phenological reports. However, crop phenology classification precision must be substantially improved to transform data into actionable management decisions for farmers and agronomists. An integrated approach utilizing ground truth field data for maize crop phenology (2013–2018 seasons), satellite imagery (Landsat 8), and weather data was explored with the following objectives: (i) model training and validation: identify the best combination of spectral bands, vegetation indices (VIs), weather parameters, geolocation, and ground truth data, resulting in a model with the highest accuracy across years at each season segment (step one); and (ii) model testing: post-selection model performance evaluation for each phenology class with unseen data (hold-out cross-validation) (step two). The best model performance for classifying maize phenology was documented when VIs (NDVI, EVI, GCVI, NDWI, GVMI) and vapor pressure deficit (VPD) were used as input variables. This study supports the integration of field ground truth, satellite imagery, and weather data to classify maize crop phenology, thereby facilitating foundational decision making and agricultural interventions for the different members of the agricultural chain.
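For reference, the indices and VPD named above can be computed from band reflectances and basic weather variables using their common published formulations; the sketch below uses those standard definitions and does not reproduce the study's exact preprocessing (NDWI has several variants; the SWIR-based formulation is assumed here):

```python
import numpy as np

def vegetation_indices(blue, green, red, nir, swir):
    """Common formulations of the VIs named in the abstract (surface reflectance, 0-1)."""
    ndvi = (nir - red) / (nir + red)
    evi  = 2.5 * (nir - red) / (nir + 6.0 * red - 7.5 * blue + 1.0)
    gcvi = nir / green - 1.0
    ndwi = (nir - swir) / (nir + swir)  # SWIR-based (Gao-style) formulation assumed
    gvmi = ((nir + 0.1) - (swir + 0.02)) / ((nir + 0.1) + (swir + 0.02))
    return ndvi, evi, gcvi, ndwi, gvmi

def vpd_kpa(temp_c, rh_pct):
    """One standard way to derive VPD from air temperature (deg C) and relative humidity (%)."""
    es = 0.6108 * np.exp(17.27 * temp_c / (temp_c + 237.3))  # Tetens saturation vapor pressure
    return es * (1.0 - rh_pct / 100.0)

# Example with Landsat 8-like reflectances and a warm, dry afternoon
print(vegetation_indices(0.05, 0.08, 0.06, 0.45, 0.20))
print(vpd_kpa(30.0, 40.0))  # roughly 2.5 kPa
```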
- Award ID(s): 1715894
- PAR ID: 10341612
- Date Published:
- Journal Name: Scientific Reports
- Volume: 11
- Issue: 1
- ISSN: 2045-2322
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
Detecting crop phenology with satellite time series is important to characterize agroecosystem energy-water-carbon fluxes, manage farming practices, and predict crop yields. Despite advances in satellite-based crop phenological retrievals, interpreting those retrieval characteristics in the context of on-the-ground crop phenological events remains a long-standing hurdle. In recent years, the emergence of near-surface phenology cameras (e.g., PhenoCams), along with satellite imagery of both high spatial and temporal resolution (e.g., PlanetScope imagery), has largely facilitated direct comparisons of retrieved characteristics to visually observed crop stages for phenological interpretation and validation. The goal of this study is to systematically assess near-surface PhenoCam and high-resolution PlanetScope time series in reconciling sensor- and ground-based crop phenological characterizations. Using two critical crop stages (crop emergence and maturity) as an example, we retrieved diverse phenological characteristics from both PhenoCam and PlanetScope imagery for a range of agricultural sites across the United States. The results showed that the curvature-based Greenup and Gu-based Upturn estimates were in good congruence with the visually observed crop emergence stage (RMSE about 1 week, bias about 0–9 days, and R square about 0.65–0.75). The threshold- and derivative-based End of greenness falling Season (EOS) estimates reconciled well with visual crop maturity observations (RMSE about 5–10 days, bias about 0–8 days, and R square about 0.6–0.75). The concordance among PlanetScope, PhenoCam, and visual phenology demonstrates the potential to interpret fine-scale sensor-derived phenological characteristics in the context of physiologically well-characterized crop phenological events, paving the way toward formal protocols for bridging ground- and satellite-based phenological characterization.
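For context, a simple threshold-based retrieval of greenup and end-of-season dates from a smoothed VI time series might look like the sketch below (synthetic data); the study's curvature-based (Greenup, Gu Upturn) and derivative-based retrievals are more involved and are not reproduced here:

```python
import numpy as np

def threshold_dates(doy, vi, frac=0.5):
    """Estimate greenup and end-of-season dates as the days the smoothed VI
    crosses a fraction of its seasonal amplitude (one common threshold approach)."""
    vi_s = np.convolve(vi, np.ones(5) / 5, mode="same")   # simple moving-average smoothing
    thr = vi_s.min() + frac * (vi_s.max() - vi_s.min())
    peak = int(np.argmax(vi_s))
    greenup = doy[np.argmax(vi_s[:peak + 1] >= thr)]      # first crossing before the peak
    eos = doy[peak + np.argmax(vi_s[peak:] <= thr)]       # first crossing after the peak
    return greenup, eos

# Synthetic single-season NDVI curve sampled every 5 days
doy = np.arange(90, 300, 5)
ndvi = 0.2 + 0.6 * np.exp(-((doy - 200) / 40.0) ** 2)
print(threshold_dates(doy, ndvi))
```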
The rapid growth in population, climate variability, and decreasing water resources necessitate innovative agricultural practices to ensure food security and resource conservation. This study investigates the effectiveness of multispectral imagery from several remote sensing (RS) platforms, namely Unmanned Aerial Systems (UAS), PlanetDove microsatellites, Sentinel-2, Landsat 8/9, and the proximal MSR-5 sensor, in assessing crop biophysical characteristics (CBPCs) and actual crop evapotranspiration (ETa) for maize fields in northeastern Colorado. The research aims to evaluate the accuracy of vegetation indices (VIs) derived from these platforms in estimating key CBPCs, including leaf area index (LAI), crop height (Hc), and fractional vegetation cover (Fc), as well as ETa. Field experiments were conducted during 2022 at the USDA-ARS Limited Irrigation Research Farm in Greeley, Colorado, U.S.A., using different irrigation strategies. Surface reflectance data collected with a handheld sensor, together with observed LAI, Hc, and Fc values, served as ground truth for validating RS estimates. The study applied various statistical analyses to compare the performance of the different RS platforms and models. Results indicate that higher-resolution platforms, particularly UAS, provided higher accuracy in estimating VIs and CBPCs than satellite platforms. The study also highlights the influence of environmental conditions on the accuracy of RS models, with locally calibrated models outperforming those developed under dissimilar conditions. The findings underscore the potential of advanced RS technologies for enhancing precision agriculture practices and optimizing water resource management.
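A brief sketch of the kind of agreement statistics (RMSE, mean bias, R²) typically used to compare platform-derived estimates against ground truth in validation studies like this one; the data values below are hypothetical:

```python
import numpy as np

def agreement_stats(observed, estimated):
    """RMSE, mean bias, and R^2 between ground-truth and remotely sensed estimates."""
    observed = np.asarray(observed, dtype=float)
    estimated = np.asarray(estimated, dtype=float)
    resid = estimated - observed
    rmse = float(np.sqrt(np.mean(resid ** 2)))
    bias = float(np.mean(resid))
    ss_res = float(np.sum(resid ** 2))
    ss_tot = float(np.sum((observed - observed.mean()) ** 2))
    return {"rmse": rmse, "bias": bias, "r2": 1.0 - ss_res / ss_tot}

# Hypothetical LAI values: field measurements vs. a UAS-derived estimate
lai_obs = [1.2, 2.5, 3.8, 4.6, 5.1]
lai_uas = [1.0, 2.7, 3.6, 4.9, 5.0]
print(agreement_stats(lai_obs, lai_uas))
```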
Accurate and timely crop mapping is essential for yield estimation, insurance claims, and conservation efforts. Over the years, many successful machine learning models for crop mapping have been developed that use just the multispectral imagery from satellites to predict crop type over the area of interest. However, these traditional methods do not account for the physical processes that govern crop growth. At a high level, crop growth can be envisioned as physical parameters, such as weather and soil type, acting upon the plant, leading to crop growth, which can be observed via satellites. In this paper, we propose a Weather-based Spatio-Temporal segmentation network with ATTention (WSTATT), a deep learning model that leverages this understanding of crop growth by formulating it as an inverse model that combines weather (Daymet) and satellite imagery (Sentinel-2) to generate accurate crop maps. We show that our approach provides significant improvements over existing algorithms that solely rely on spectral imagery by comparing segmentation maps and F1 classification scores. Furthermore, effective use of attention in the WSTATT architecture enables the detection of crop types earlier in the season (up to 5 months in advance), which is very useful for improving food supply projections. We finally discuss the impact of weather by correlating our results with crop phenology to show that WSTATT is able to capture the physical properties of crop growth.
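The following is a heavily simplified sketch, not the published WSTATT architecture, of how per-time-step spectral and weather features could be fused and pooled with temporal attention before classification; all layer sizes and tensor shapes are placeholders (PyTorch assumed):

```python
import torch
import torch.nn as nn

class WeatherImageryAttention(nn.Module):
    """Minimal sketch: embed spectral + weather features at each time step,
    pool them with learned temporal attention, then classify crop type."""
    def __init__(self, n_bands=10, n_weather=4, d=64, n_classes=8):
        super().__init__()
        self.embed = nn.Linear(n_bands + n_weather, d)
        self.attn_score = nn.Linear(d, 1)   # one attention weight per time step
        self.head = nn.Linear(d, n_classes)

    def forward(self, spectral, weather):
        # spectral: (batch, time, n_bands); weather: (batch, time, n_weather)
        h = torch.relu(self.embed(torch.cat([spectral, weather], dim=-1)))
        w = torch.softmax(self.attn_score(h), dim=1)   # temporal attention weights
        pooled = (w * h).sum(dim=1)                    # attention-weighted temporal pooling
        return self.head(pooled)

model = WeatherImageryAttention()
logits = model(torch.randn(2, 20, 10), torch.randn(2, 20, 4))
print(logits.shape)  # torch.Size([2, 8])
```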

