Article
Peer-Review Record

Evaluation of Vegetation Biophysical Variables Time Series Derived from Synthetic Sentinel-2 Images

by Najib Djamai *, Detang Zhong, Richard Fernandes and Fuqun Zhou
Reviewer 1:
Reviewer 2: Anonymous
Reviewer 3: Anonymous
Reviewer 4: Anonymous
Submission received: 22 May 2019 / Revised: 19 June 2019 / Accepted: 26 June 2019 / Published: 29 June 2019

Round 1

Reviewer 1 Report

This paper presents a method to predict missing (cloudy) images at relatively high temporal and spatial resolution by fusing Sentinel-2 and MODIS data. The predicted images are then used to retrieve five vegetation biophysical variables. The method was evaluated at both the reflectance and vegetation-parameter levels. As sub-weekly high-resolution vegetation products are still required by the global climate observing system, this work should be of interest to readers. The core of the method is the PSRFM model and the SL2P process, both of which have been reported before, which may reduce the novelty of this work. From my point of view, this work is an application of PSRFM and SL2P. In addition, the title of this paper ("Evaluation of...") indicates that the paper should mainly focus on the evaluation; however, this part is a little weak in the current version. Above all, I suggest a major revision that should:


1) Put some figures showing the time-series curves of both the predicted reflectance and the vegetation variables;

2) Compare the derived high temporal and spatial resolution data with MODIS products via an upscaling exercise (for example: Xiao, Z.; Liang, S.; Wang, J.; Chen, P.; Yin, X.; Zhang, L.; Song, J. Use of general regression neural networks for generating the GLASS leaf area index product from time-series MODIS surface reflectance. IEEE Trans. Geosci. Remote Sens. 2014, 52, 209–223.

Yan, K.; Park, T.; Yan, G.; Chen, C.; Yang, B.; Liu, Z.; Nemani, R.; Knyazikhin, Y.; Myneni, R. Evaluation of MODIS LAI/FPAR product Collection 6. Part 1: Consistency and improvements. Remote Sens. 2016, 8, 359.

Baret, F.; Weiss, M.; Lacaze, R.; Camacho, F.; Makhmara, H.; Pacholcyzk, P.; Smets, B. GEOV1: LAI and FAPAR essential climate variables and FCOVER global time series capitalizing over existing products. Part 1: Principles of development and production. Remote Sens. Environ. 2013, 137, 299–309.);

3) The S2-MSI image that is used to validate the predicted S2-LIKE image should not be used in the PSRFM; these two images should be completely independent.
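The upscaling comparison suggested in point 2) could, for instance, be sketched as follows. This is a minimal illustration only, assuming simple block averaging of a fine-resolution (e.g., 20 m) LAI map onto a coarse (e.g., 500 m MODIS-like) grid before comparing with the coarse product; the array sizes, the aggregation factor of 25, and the `block_average` helper are hypothetical and not taken from the paper.

```python
import numpy as np

def block_average(fine: np.ndarray, factor: int) -> np.ndarray:
    """Aggregate a 2-D array by averaging non-overlapping factor x factor blocks."""
    h, w = fine.shape
    h_c, w_c = h // factor, w // factor
    trimmed = fine[: h_c * factor, : w_c * factor]  # drop edge pixels that do not fill a block
    return trimmed.reshape(h_c, factor, w_c, factor).mean(axis=(1, 3))

# Toy example: a 50 x 50 "20 m" LAI map aggregated by a factor of 25 (~500 m pixels).
rng = np.random.default_rng(0)
fine_lai = rng.uniform(0.0, 6.0, size=(50, 50))
coarse_lai = block_average(fine_lai, 25)            # shape (2, 2)

# A coarse-resolution reference (standing in for a MODIS LAI product) would be
# compared pixel-by-pixel, e.g. with an RMSE:
modis_lai = rng.uniform(0.0, 6.0, size=coarse_lai.shape)
rmse = float(np.sqrt(np.mean((coarse_lai - modis_lai) ** 2)))
```

In practice the comparison would of course use reprojected, co-registered rasters rather than random arrays, and the aggregation would weight by pixel footprint.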




Some minor problems:


1. Initial upper case should be used for all abbreviations when they are defined (P52: Leaf Area Index (LAI)).

2. In Fig. 4, the histogram's Y-axis should show the percentage.

3. In Fig. 5(a), the label should be LAI [Unitless], not LAI [-].

4. Some figures are not clear; see Fig. 6.

5. Fitting lines should be added to Fig. 6.


Author Response


Dear Reviewer,

Please find enclosed my responses to your comments concerning the paper referenced above.

I greatly appreciate your helpful comments, which certainly contribute to the improvement of the paper.

 

Thank you very much.

 


Author Response File: Author Response.docx

Reviewer 2 Report

Please find my comments from the attachment.

Comments for author File: Comments.pdf

Author Response


Dear Reviewer,

Please find enclosed my responses to your comments concerning the paper referenced above.

I greatly appreciate your helpful comments, which certainly contribute to the improvement of the paper.

 

Thank you very much.

 



Author Response File: Author Response.docx

Reviewer 3 Report

This work addresses two research questions:

1. Can we retrieve vegetation biophysical variables from synthetic S2-LIKE images with sub-weekly frequency in mid-latitude regions?

2. What is the uncertainty of vegetation biophysical variable estimates from S2-LIKE images compared to estimates from S2-MSI images and in-situ data, and how does the uncertainty change with temporal gap size?

The paper is clearly presented and is supported by the data and the techniques used. The results seem to be reasonable and the method seems to be consistent.

 

Some comments and suggestions:

 

L. 56 - their precision is   --- their accuracy is

 

L.203 - training sue to  --- training due to

 

L.346 – Figure 7  --- It should be improved.

 

L.364 - the error is still meets the  ----  the error still meets the

 

L.423 - the low precision  ----  the low accuracy

 

L.429 - evaluate the precision  ----  evaluate the accuracy

 

L. 439 - The objectives of this research were to determine if sub-weekly vegetation biophysical parameter products can be derived using Sentinel 2 imagery and to validate these derived products.  --- can be removed from the text.

 

L. 549 - ESA Senitenl-2 Team --- ESA Sentinel-2 Team

 


Author Response

Dear Reviewer,

Please find enclosed my responses to your comments concerning the paper referenced above.

I greatly appreciate your helpful comments, which certainly contribute to the improvement of the paper.

 

Thank you very much.


Author Response File: Author Response.docx

Reviewer 4 Report

I have carefully reviewed the manuscript titled: “Evaluation of vegetation biophysical variables time series derived from synthetic Sentinel-2 images”.

 

There are two major issues related to the design of the study:


1) My greatest concern is that the practical interest of this study is outdated; these results arrive too late to be considered a novelty. The authors are assessing whether a sub-weekly time series of vegetation biophysical variables is possible when we all know it is. Indeed, Sentinel-2B was launched on 7 March 2017, and it is known that combining Sentinel-2A and Sentinel-2B observations gives better spatio-temporal resolution, up to 10 m with a 2- to 3-day revisit. Oddly enough, however, the authors came up with a recommendation to use synthetic S2-LIKE surface reflectance data, produced by blending clear-sky S2-MSI images from Sentinel-2A with daily BRDF-adjusted MODIS images, to provide sub-weekly time series of vegetation biophysical variables at medium resolution (~20 m).


2) In addition, the study is based on one year of data collected in 2016, which is a very short period of time. Two years are the minimum duration for agricultural investigations, and most agro-ecological investigations derive their time series from several years (e.g., five) of data gathering. However, this is a minor issue relative to the one raised above.

 

Author Response

Dear Reviewer,

Please find enclosed my responses to your comments concerning the paper referenced above.

I greatly appreciate your helpful comments, which certainly contribute to the improvement of the paper.

 

Thank you very much.


Author Response File: Author Response.docx

Round 2

Reviewer 1 Report

The new version of this paper is much improved. I have no further concerns and suggest that it be published.

Reviewer 2 Report

I carefully read the responses to my comments as well as the revised version of the manuscript. The manuscript has been significantly improved, and all my comments were taken into account. I have no further comments. Thanks!

Reviewer 4 Report

I have completed the second review of the manuscript titled: “Evaluation of vegetation biophysical variables time series derived from synthetic Sentinel-2 images”.

 

Overall, there were no improvements relative to my previous concerns about the design of the study:

1) The study was based on data from one season (year 2016), while agricultural investigation requires a minimum of two years, and agro-ecological studies require even more spatio-temporal replications. As such, the findings and inferences produced in this work are unreliable. The authors offered mixed and unconvincing explanations on this issue, invoking for instance CEOS (Committee on Earth Observation Satellites) Stage 1. But even CEOS Stage 1 states that: "Product accuracy is assessed from a small (typically < 30) set of locations and time periods by comparison with in-situ or other suitable reference data." In this study, validation was done on data collected on three dates (in one cropping season) spanning less than 40 days from leaf emergence, meaning that they missed the growth cycle of all the crop species evaluated. We know, for instance, that corn (one of the crop species included in this study) requires 60 to 100 days to reach harvest, depending on the variety and warm weather.

2) The study included 37 data sampling units and several (7) crop species, including canola, wheat, oats, alfalfa, corn, beans, and soybean, with different morpho-physiological properties and different photosynthetic pathways (there were both C4 and C3 crop species). Implementing predictive analytics on such a crop-species mixture with very few data points collected from one season cannot be recommended.

3) My other concern remains that the practical interest of this study is outdated: the findings in this work are no better than what the Sentinel-2 constellation can offer. Indeed, combining Sentinel-2A and Sentinel-2B observations gives better spatio-temporal resolution, up to 10 m with a 2- to 3-day revisit, compared with the authors' recommendation to use synthetic S2-LIKE surface reflectance data, produced by blending clear-sky S2-MSI images from Sentinel-2A with daily BRDF-adjusted MODIS images, to provide sub-weekly time series of vegetation biophysical variables at medium resolution (~20 m).

4) There are other in-text issues, but the above are the major concerns regarding this manuscript.

5) My suggestion to the authors is to re-run the experiment with current satellite remote sensing tools, then apply multi-source data fusion and make their case for potential improvements over existing crop monitoring technologies.
