This is a Preprint and has not been peer reviewed. The published version of this Preprint is available: https://doi.org/10.1016/j.rse.2023.113800. This is version 1 of this Preprint.
Abstract
Information on crop phenology is essential for better understanding the impacts of climate, climate change, management practices, and environmental conditions on agricultural production. Today's novel optical and radar satellite data, with increasing spatial and temporal resolution, offer great opportunities to derive such information. So far, however, we largely lack methods that leverage these data to provide detailed information on crop phenology at the field level. Here, we propose a method based on dense time series from Sentinel-1, Sentinel-2, and Landsat 8 to detect the start of seven phenological stages of winter wheat, from seeding to harvest. We built different feature sets from these input data and compared their performance for training a one-dimensional temporal U-Net. The model was evaluated using a comprehensive reference data set from a national phenology network, covering 16,000 field observations of winter wheat in Germany from 2017 to 2020, and was compared against a Random Forest baseline.
Our results show that optical and radar data differ in how well they are suited for detecting the individual stages, owing to their distinct signal characteristics. The combination of both data types performed best, with 50.1% to 65.5% of phenological stages predicted with an absolute error of less than six days. Late stages in particular can be predicted well, e.g., with a coefficient of determination (R²) between 0.51 and 0.62 for harvest, whereas earlier stages such as stem elongation remain a challenge (R² between 0.06 and 0.28). Moreover, our results indicate that meteorological data have comparatively little explanatory power for fine-scale phenological development of winter wheat.
Overall, our results demonstrate the potential of dense satellite image time series from the Sentinel and Landsat sensor constellations, combined with the versatility of deep learning models, for determining phenological timing.
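To illustrate the kind of architecture named above, the following is a minimal, hypothetical sketch of a one-dimensional temporal U-Net in PyTorch that maps a multi-sensor feature time series to per-time-step stage predictions. All layer sizes, channel counts, and names are illustrative assumptions and do not reproduce the authors' exact feature sets or network configuration.

# Illustrative sketch only: a minimal 1-D temporal U-Net for per-time-step
# classification of satellite time series. Channel counts, depth, and the
# number of classes are assumptions, not the authors' actual settings.
import torch
import torch.nn as nn


class ConvBlock(nn.Module):
    """Two 1-D convolutions over the time axis with batch norm and ReLU."""
    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.block = nn.Sequential(
            nn.Conv1d(in_ch, out_ch, kernel_size=3, padding=1),
            nn.BatchNorm1d(out_ch),
            nn.ReLU(inplace=True),
            nn.Conv1d(out_ch, out_ch, kernel_size=3, padding=1),
            nn.BatchNorm1d(out_ch),
            nn.ReLU(inplace=True),
        )

    def forward(self, x):
        return self.block(x)


class TemporalUNet1D(nn.Module):
    """Encoder-decoder over the time dimension with skip connections."""
    def __init__(self, in_channels, n_classes, base=32):
        super().__init__()
        self.enc1 = ConvBlock(in_channels, base)
        self.enc2 = ConvBlock(base, base * 2)
        self.bottleneck = ConvBlock(base * 2, base * 4)
        self.pool = nn.MaxPool1d(2)
        self.up2 = nn.ConvTranspose1d(base * 4, base * 2, kernel_size=2, stride=2)
        self.dec2 = ConvBlock(base * 4, base * 2)
        self.up1 = nn.ConvTranspose1d(base * 2, base, kernel_size=2, stride=2)
        self.dec1 = ConvBlock(base * 2, base)
        self.head = nn.Conv1d(base, n_classes, kernel_size=1)

    def forward(self, x):
        # x: (batch, features, time), e.g. stacked optical/radar features
        e1 = self.enc1(x)
        e2 = self.enc2(self.pool(e1))
        b = self.bottleneck(self.pool(e2))
        d2 = self.dec2(torch.cat([self.up2(b), e2], dim=1))
        d1 = self.dec1(torch.cat([self.up1(d2), e1], dim=1))
        return self.head(d1)  # per-time-step class logits


if __name__ == "__main__":
    # Hypothetical example: 8 input features over 128 time steps,
    # 8 output classes (e.g. seven stages plus a background class).
    model = TemporalUNet1D(in_channels=8, n_classes=8)
    x = torch.randn(4, 8, 128)
    print(model(x).shape)  # torch.Size([4, 8, 128])

In such a setup, the start of each phenological stage would be read off from the transitions in the per-time-step class predictions; how the feature sets are stacked and how predictions are post-processed is described in the paper itself.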
DOI
https://doi.org/10.31223/X53X11
Subjects
Agricultural Science, Agriculture, Artificial Intelligence and Robotics, Environmental Monitoring
Keywords
agriculture, crop monitoring, Convolutional Neural Networks, U-Net, Multisensor, Data Fusion
Dates
Published: 2023-07-13 12:28
License
CC BY-NC-ND (Attribution-NonCommercial-NoDerivatives) 4.0 International
Additional Metadata
Conflict of interest statement:
None