Article

Developing Novel Rice Yield Index Using UAV Remote Sensing Imagery Fusion Technology

1 College of Biosystems Engineering and Food Science, Zhejiang University, Hangzhou 310058, China
2 College of Mechanical and Electrical Engineering, Xinjiang Agricultural University, Urumqi 830052, China
3 State Key Laboratory of Rice Biology, China National Rice Research Institute, Hangzhou 310006, China
4 State Key Laboratory of Fluid Power and Mechatronic Systems, Zhejiang University, Hangzhou 310058, China
* Author to whom correspondence should be addressed.
Submission received: 13 May 2022 / Revised: 9 June 2022 / Accepted: 14 June 2022 / Published: 17 June 2022
(This article belongs to the Special Issue UAS in Smart Agriculture)

Abstract

Efficient and quick yield prediction is of great significance for ensuring world food security and crop breeding research. The rapid development of unmanned aerial vehicle (UAV) technology makes remote sensing monitoring of crops more timely and accurate. The objective of this study was to explore a method of developing a novel yield index (YI) with wide adaptability for yield prediction by fusing vegetation indices (VIs), color indices (CIs), and texture indices (TIs) from UAV-based imagery. Six field experiments with 24 varieties of rice and 21 fertilization methods were carried out at three experimental stations in 2019 and 2020. The multispectral and RGB images of the rice canopy collected by the UAV platform were used to rebuild six new VIs and TIs. The performance of the VI-based YI (MAPE = 13.98%) developed by quadratic nonlinear regression at the maturity stage was better than at other stages, and outperformed that of the CI-based (MAPE = 22.21%) and TI-based (MAPE = 18.60%) YIs. Then, six VIs, six CIs, and six TIs were fused to build the YI by multiple linear regression and random forest models. Compared with the heading stage (R2 = 0.78, MAPE = 9.72%) and all stage (R2 = 0.59, MAPE = 22.21%), the best-performing YI was developed by random forest fusing VIs + CIs + TIs at the maturity stage (R2 = 0.84, MAPE = 7.86%). Our findings suggest that the novel YI proposed in this study has great potential in crop yield monitoring.

1. Introduction

Rice, one of the most important food crops, is the staple food for about half of the global population. Rice yield prediction is of great significance for ensuring world food security and for breeding new varieties with high yield and good stress resistance. A brief review of previous studies shows that crop yield estimation methods mainly include yield sampling surveys, agro-meteorological yield prediction, yield simulation based on crop biomass, and yield estimation based on remote sensing. A field sampling survey is the most commonly used method: survey samples of 1 m × 1 m are taken from each field at the maturity stage, and after threshing and moisture measurement, the average yield of all samples represents the measured rice yield. The yield sampling survey method is time-consuming, labor-intensive, and costly for researchers conducting large-scale field measurements. Agro-meteorological yield prediction requires long-term agro-meteorological monitoring data, and the lack of multi-period crop growth parameters leads to unstable accuracy and high cost. Yield simulation based on crop biomass has a clear prediction mechanism, but the model needs many input parameters, and it is difficult to determine the best parameters across varieties and fertilization treatments. Therefore, it is urgent to develop a fast and reliable technology for yield prediction. Owing to its multi-temporal image acquisition ability and low economic cost, the development and application of remote sensing in crop monitoring is currently the focus of many researchers. Although ground-based platform sensors [1,2] are easy to operate and obtain accurate crop information, their small scanning area per pass leads to low efficiency. Limited by weather conditions and resolution, satellite-based sensors [3] struggle to meet the requirements of modern precision agriculture management. The convenience of the unmanned aerial vehicle (UAV) and its various supporting sensors enables UAV-based sensors [4,5,6] to overcome these limitations.
Generally speaking, there are two main methods for crop growth monitoring. A common method is to use image classification based on high-resolution RGB images to obtain crop parameters, such as vegetation coverage [7], biomass [8], and plant height [9]. For example, in a potato growth monitoring study, RGB and hyperspectral imaging data of the potato crop canopy were obtained by UAV to estimate crop biomass and predict crop yield [10]. In oilseed rape research, a UAV equipped with RGB and multispectral sensors was used to acquire a series of field images at the flowering stage, and vegetation indices (VIs) extracted from the images were used to predict the number of oilseed rape flowers [11]. In soybean research, the ability to estimate soybean yield from RGB sensor data was evaluated in a deep learning framework [12]; however, the accuracy of the model was found to depend on the soybean varieties. The other method is to use a vegetation index (VI) or color index (CI) extracted from RGB and multispectral images to monitor crop status at the field scale, such as crop yield [13], aboveground biomass [14,15], and crop lodging identification [16]. For example, Zhou et al. [17] used single-stage and multi-temporal VIs derived from multispectral images to predict rice grain yield with R2 = 0.75 and RMSE = 947.69 kg·ha−1. Naito et al. [18] found a high correlation between several VIs and multiple yield traits (panicle number, grain weight, and shoot biomass), with simple ratio indices exhibiting better performance in estimating grain weight (R2 = 0.80). Zhang et al. [19] reported that a VI-based model accurately predicted the differences between treatments and grain yield (R2 > 0.7). However, previous rice yield estimation studies used only a few varieties and several nitrogen gradient treatments as experimental materials. Moreover, the accuracy of yield prediction models is often easily affected by crop varieties and environmental conditions, resulting in low generalizability. Therefore, it is imperative to construct a widely adaptable yield index (YI) for rice yield prediction. To date, no such yield index has been reported.
Extracting VIs and color indices (CIs) from multispectral and RGB images is a reliable method for agricultural crop monitoring [20]. The sensitivity of VIs and CIs changes across the growth stages of crops, so it is necessary to combine multiple indices to complete the monitoring task. Rice with different yields usually differs in color and plant density, which leads to changes in texture characteristics. Some scholars have found that combining spectral and texture information for biomass estimation improves accuracy [21]. The method of fusing VIs, CIs, and texture from UAV images needs further research.
Furthermore, it has rarely been noted that the commonly used VIs and CIs are calculated from specific bands chosen based on the experience of previous studies. For example, the normalized difference vegetation index (NDVI) is calculated by normalizing the near-infrared and red bands; it performs well in predicting chlorophyll and nitrogen content, but not yield. Different fertilizer treatments can affect crop canopy structure, which ultimately changes both yield and texture features. Some studies have used texture features to predict forest and wheat biomass and showed that texture features improve model prediction accuracy. In addition, the commonly used textures are calculated from a single band, and few researchers have studied how to construct a new index from textures in different bands, analogous to NDVI. Therefore, it is necessary to propose a method to select the best band combination for each VI and to construct new texture features to improve prediction accuracy.
This research was aimed at exploring a method of developing a novel YI for predicting rice yield by fusing UAV-based VIs, CIs, and TIs. The main objectives were: (1) to calculate new VIs, CIs, and TIs suitable for rice yield prediction, and to analyze the quantitative relationships between yield and VI, CI, and TI; (2) to develop the YI and compare its performance using different regression methods (i.e., multiple linear regression (MLR), quadratic nonlinear regression (QNR), and random forest (RF)); and (3) to evaluate the prediction performance of the YI using the 2019 and 2020 data.

2. Materials and Methods

2.1. Experimental Design

The data used in this study were obtained by UAV-based sensors from six experiments with 24 varieties of rice and 21 fertilization treatments in 232 plots over two years (Figure 1). The information on rice varieties and fertilizer treatments is summarized in Table 1. Experiments 1 and 2 were conducted at the Fuyang Experimental Station (30°04′61″ N, 119°55′27″ E) of the China National Rice Research Institute in Hangzhou, China in 2019. A completely random block design with three replicates was applied in Experiment 1 (24 plots), which included four rice varieties and two fertilizer treatments. The rice varieties for Experiment 2 (96 plots) were Zhongzao 39 and Zhongjiazao 17. The first area of Experiment 2 was designed with three replicates including two varieties and six fertilizer treatments, and the second area with five replicates including two varieties and six fertilizer treatments. Four other experiments were carried out in four fields at three experimental stations (Fuyang, Yuhang, and Pingyao) in Hangzhou, China in 2020. Experiment 3 (11 plots) was designed with three replicates, which included one rice variety and four fertilizer treatments; effective remote sensing data could not be collected from one plot due to rice lodging. Nineteen varieties of rice received the same fertilizer treatment in Experiment 4 (19 plots) without replication. Experiment 5 included 28 plots with four rice varieties and seven fertilizer treatments. Experiment 6 (54 plots) was conducted with 11 replicates including one variety and five fertilizer treatments; one plot was abandoned due to lodging.

2.2. Image Acquisition and Processing

A compact multispectral camera mounted on the DJI Phantom 4 Multispectral (Da Jiang Innovations, Inc., Shenzhen, Guangdong, China) was used to obtain multispectral and RGB images at the heading and maturity stages (Table 1). The DJI Phantom 4 Multispectral is a small quadrotor UAV connected to the camera by a three-axis stabilizing gimbal. The multispectral camera consists of six 2.08-megapixel camera sensors (1600 × 1300 pixels), including five monochrome sensors (Blue (B) 450 ± 16 nm, Green (G) 560 ± 16 nm, Red (R) 650 ± 16 nm, Red edge (RE) 730 ± 16 nm, Near-infrared (NIR) 840 ± 26 nm) and a true-color camera sensor (visible light; R, G, B).
To prevent image distortion caused by weather conditions, all UAV flights were conducted between 11:00 a.m. and 2:00 p.m. on sunny days with full sunlight and low wind speed. The altitude of each flight was 25 m, and the average flight speed was 2.5 m/s. The camera lens was kept vertical to the ground to collect orthographic images of the experimental plots. The forward and side overlaps were both maintained at 75% to ensure good image mosaic results. Four calibration boards (reflectance values: 10%, 30%, 50%, and 80%) were placed horizontally before each flight, and the UAV hovered 1.5 m above the boards to photograph them with the multispectral camera; these images were used for radiometric correction. The UAV flew over each experimental field along planned routes and automatically acquired multispectral and RGB images at fixed intervals. Except for the NIR band (26 nm), the bandwidth of the other four bands was 16 nm. The ground sampling distance of the multispectral and RGB images was 0.013 m/pixel. All rice was sown in May and harvested in October, and the yield of every plot was measured manually.
The original multispectral and RGB images obtained by the UAV platform were processed with orthophoto mosaicking and radiometric correction before data extraction. These image processing steps were completed in DJI Terra software (Da Jiang Innovations, Inc., Shenzhen, Guangdong, China). All images and the corresponding reflectance values of the four calibration boards were imported into DJI Terra, which completed the radiometric correction automatically. Individual aerial images were calibrated and aligned in DJI Terra to generate an orthophoto of each experimental field. With calibrated internal parameters and onboard geolocation, the georeferencing RMSE of the block adjustment (between original and calibrated locations) was 1.1 cm. The R, G, B, RE, and NIR bands of each experimental plot were then extracted in ENVI 5.3 software (Exelis Visual Information Solutions, Boulder, CO, USA).
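To illustrate the principle behind board-based radiometric correction, the sketch below shows an empirical-line calibration in Python. It is a minimal illustration only, with made-up digital numbers, and is not the actual algorithm implemented inside DJI Terra; all array names are hypothetical.

```python
import numpy as np

# Known reflectances of the four calibration boards placed before each flight.
BOARD_REFLECTANCE = np.array([0.10, 0.30, 0.50, 0.80])

def empirical_line_correction(band_dn, board_dn):
    """Convert raw digital numbers (DN) of one band to surface reflectance.

    Fits DN = gain * reflectance + offset over the four boards, then inverts
    the line for every pixel of the band.
    """
    gain, offset = np.polyfit(BOARD_REFLECTANCE, board_dn, deg=1)
    return (band_dn - offset) / gain

# Hypothetical mean board DNs in the red band, plus a toy image to correct.
board_dn_red = np.array([520.0, 1480.0, 2410.0, 3790.0])
raw_red = np.random.randint(400, 4000, size=(1300, 1600)).astype(float)
red_reflectance = empirical_line_correction(raw_red, board_dn_red)
```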

2.3. Feature Selection

VIs and CIs can sensitively monitor crop growth. The number of rice panicles is closely related to rice yield, and texture features can indicate the distribution of rice leaves and panicles, so fused features that include texture may help improve prediction accuracy. The correlations between the VIs, CIs, TIs, and yield were analyzed to select the best combination of input variables. Six VIs and six CIs were selected with reference to commonly used indices in a series of published studies. However, these VIs, which were calculated from specific bands based on the experience of previous studies, may not perform best in yield prediction. Therefore, the original band combinations of each VI were not adopted directly. Instead, each new index was reconstructed from any two or four of the five bands according to its formula, and the optimal combinations were selected by analyzing the correlation between the VI and yield. The CIs were calculated from the normalized r, g, and b bands with reference to the previous studies in Table 2. The normalization is given by the following formulas:
r = R/(R + G + B)
g = G/(R + G + B)
b = B/(R + G + B)
where R, G, and B represent the average digital number values (0–255) of the red, green, and blue bands in the RGB images, and r, g, and b represent the normalized values of these three bands.
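As an illustration of how such CIs can be computed from a plot image, the following Python sketch normalizes the bands and evaluates a few of the Table 2 formulas. The function names and the random stand-in image are ours, not part of the original pipeline.

```python
import numpy as np

def normalized_rgb(rgb):
    """Per-pixel band normalization of an RGB plot image (H, W, 3), DN 0-255."""
    rgb = rgb.astype(float)
    total = rgb.sum(axis=2, keepdims=True)
    total[total == 0] = 1.0          # avoid division by zero on black pixels
    r, g, b = np.moveaxis(rgb / total, 2, 0)
    return r, g, b

def color_indices(rgb):
    """Plot-mean color indices following formulas from Table 2."""
    r, g, b = (band.mean() for band in normalized_rgb(rgb))
    return {
        "CI1": 2 * g - b - r,                    # excess-green form
        "CI2": (g**2 - r**2) / (g**2 + r**2),
        "CI6": (2 * g - b - r) / (2 * g + b + r),
    }

plot_img = np.random.randint(0, 256, size=(64, 64, 3))  # stand-in for a plot crop
print(color_indices(plot_img))
```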
The frequency distribution of two gray pixels at distance (Δx, Δy) in an image can be expressed by the gray level co-occurrence matrix (GLCM), an effective method for texture feature extraction. Four GLCM-based textures, namely homogeneity (HOM), contrast (CON), energy (ENE), and correlation (COR), were calculated from the five band images in MATLAB 2018 (MathWorks, Inc., Natick, MA, USA). Further, a new texture-based index, named the texture index (TI), was calculated from any two textures selected from the four GLCM-based textures; this approach has rarely been reported. There are many possible combinations of the four texture features across the five bands, so the correlations with yield were analyzed and six TIs were calculated from the top six texture combinations. The calculation formula is as follows:
TI(T1, T2) = (T1 − T2)/(T1 + T2)
where T1 and T2 represent any two of the four GLCM-based textures (HOM, CON, ENE, COR) calculated from the B, G, R, RE, and NIR bands, and TI(T1, T2) represents the TI calculated from T1 and T2.
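A minimal sketch of this texture-index construction is shown below, using scikit-image's GLCM utilities (graycomatrix/graycoprops, available in scikit-image ≥ 0.19) rather than the authors' MATLAB code. The quantization level and the random stand-in reflectance arrays are illustrative assumptions.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

def glcm_textures(band, levels=32):
    """Four GLCM textures (HOM, CON, ENE, COR) of one gray-scale band image."""
    # Quantize reflectance to a small number of gray levels before the GLCM.
    q = np.digitize(band, np.linspace(band.min(), band.max(), levels)) - 1
    glcm = graycomatrix(q.astype(np.uint8), distances=[1], angles=[0],
                        levels=levels, symmetric=True, normed=True)
    return {prop.upper()[:3]: graycoprops(glcm, prop)[0, 0]
            for prop in ("homogeneity", "contrast", "energy", "correlation")}

def texture_index(t1, t2):
    """Normalized-difference texture index TI(T1, T2) = (T1 - T2)/(T1 + T2)."""
    return (t1 - t2) / (t1 + t2)

nir = np.random.rand(128, 128)       # stand-in for a plot's NIR reflectance
red = np.random.rand(128, 128)
ti = texture_index(glcm_textures(nir)["CON"], glcm_textures(red)["COR"])
```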

2.4. Data Sets Construction and Model Effect Evaluation

All data from the 120 samples in 2019 and 112 samples in 2020 were mixed to obtain a comprehensive dataset that ensured the robustness of the model. To evaluate yield prediction performance in different growth stages, the mixed dataset was divided into two sub-datasets based on the heading stage (HS) and maturity stage (MS). For each stage, 75% of the data were randomly selected to form the HS and MS training datasets, and the remaining 25% were used as test datasets to verify the models. The training dataset formed by mixing the data of the two stages is called all stage (AS). The dataset composition and segmentation processes are shown in Figure 2, and the research flow chart for developing the novel rice yield index is shown in Figure 3.
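A hedged sketch of this per-stage 75/25 split and AS mixing, assuming the samples live in a pandas DataFrame with a "stage" column (our naming, not the authors'):

```python
import numpy as np
import pandas as pd
from sklearn.model_selection import train_test_split

# Toy stand-in: one row per plot sample with growth stage and measured yield.
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "stage": ["HS"] * 116 + ["MS"] * 116,
    "yield_kg_ha": rng.uniform(3800, 14100, 232),
})

def split_stage(frame, stage, seed=0):
    """Randomly split one stage's samples 75/25 into training and test sets."""
    sub = frame[frame["stage"] == stage]
    return train_test_split(sub, train_size=0.75, random_state=seed)

hs_train, hs_test = split_stage(df, "HS")
ms_train, ms_test = split_stage(df, "MS")
as_train = pd.concat([hs_train, ms_train])   # "all stage" (AS) training data
as_test = pd.concat([hs_test, ms_test])
```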
The quality of the data distribution has an important impact on the construction of a prediction model. Rice yield in the training dataset ranged from 3820.5 to 13,960.8 kg/ha (coefficient of variation, CV = 28.93%) (Table 3), while yield in the test dataset ranged from 4558.6 to 14,117.4 kg/ha (CV = 29.26%). The statistical analysis shows that the test dataset was slightly more variable than the training dataset, so it can properly verify model performance. Because they contained both conventional and high-yield rice plots, the two-year rice experiments offered a dataset with the large variability needed for evaluating the yield prediction model.
The training dataset was used to train three methods: quadratic nonlinear regression (QNR), multiple linear regression (MLR), and random forest (RF), representing nonlinear, linear, and machine learning models, respectively. The regression formulas of QNR and MLR are as follows:
y = ax² + bx + c
y = a₀ + ∑ᵢ₌₁ⁿ aᵢxᵢ
where x and xᵢ represent the input variables (e.g., VI, CI, and TI), y is the predicted value of rice yield, and a, b, c, a₀, and aᵢ are the coefficients calculated during model training in MATLAB 2018b and IBM SPSS Statistics 25 (IBM Corporation, Armonk, NY, USA).
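Both regressions can be reproduced with ordinary least squares. The sketch below, on synthetic index values, shows one way to fit the QNR and MLR forms; the feature columns are stand-ins, not the paper's actual VIs, CIs, or TIs.

```python
import numpy as np

# Toy data: one index value per sample and the measured yield (kg/ha).
rng = np.random.default_rng(1)
x = rng.uniform(0.2, 0.9, 120)                    # e.g. a single VI
y = 4000 + 9000 * x + rng.normal(0, 600, 120)     # synthetic yields

# QNR: fit y = a*x^2 + b*x + c by least squares.
a, b, c = np.polyfit(x, y, deg=2)
y_qnr = a * x**2 + b * x + c

# MLR: fit y = a0 + sum_i a_i * x_i over several indices at once.
X = np.column_stack([x, x**0.5, np.log(x)])       # stand-ins for VI/CI/TI columns
design = np.column_stack([np.ones_like(x), X])    # prepend intercept column
coefs, *_ = np.linalg.lstsq(design, y, rcond=None)
y_mlr = design @ coefs
```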
Because it introduces two sources of randomness (bootstrap sampling of the training data and random selection of candidate features), RF does not easily overfit and performs well on this dataset. The number of decision trees (ntree) and the number of input variables randomly sampled at each split (mtry) are two important parameters of the RF model. When ntree is set to a large enough value, it mainly affects the running time of the model rather than the modeling accuracy; following previous studies, ntree was set to 500. The setting of mtry greatly affects the accuracy of the RF model, so it needs to be tuned according to the input variables. In this paper, 10-fold cross-validation was used to optimize the model: the average error over the 10 rounds of modeling and validation is taken as the cross-validation error.
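In scikit-learn terms, ntree corresponds to n_estimators and mtry is played by max_features. A minimal sketch of the described 10-fold cross-validated mtry search on synthetic data follows; note that scikit-learn's built-in MAPE scorer divides each error by its true value rather than by the mean, a slight departure from the paper's definition.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import GridSearchCV

rng = np.random.default_rng(2)
X = rng.random((174, 18))            # 6 VIs + 6 CIs + 6 TIs per training sample
y = 4000 + 10000 * X[:, 0] + rng.normal(0, 500, 174)

# ntree fixed at 500; mtry (max_features) tuned by 10-fold cross-validation
# on mean absolute percentage error, mirroring the paper's procedure.
search = GridSearchCV(
    RandomForestRegressor(n_estimators=500, random_state=0),
    param_grid={"max_features": range(1, 19)},   # at most 18 input variables
    scoring="neg_mean_absolute_percentage_error",
    cv=10,
)
search.fit(X, y)
print("best mtry:", search.best_params_["max_features"])
```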
The coefficient of determination (R2), mean absolute error (MAE), and mean absolute percentage error (MAPE) were selected to evaluate the performance of the model. The calculation formulas are as follows:
R² = ∑ᵢ₌₁ⁿ (Pᵢ − T̄)² / ∑ᵢ₌₁ⁿ (Tᵢ − T̄)²
MAE = (1/n) ∑ᵢ₌₁ⁿ |Tᵢ − Pᵢ|
MAPE = (1/n) ∑ᵢ₌₁ⁿ (|Tᵢ − Pᵢ| / T̄) × 100%
where Pᵢ and Tᵢ represent the predicted and true values of the sample rice yield, respectively, T̄ represents the average of the true rice yields, and n is the number of samples in the dataset.
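Implemented directly from these definitions, the metrics look like the sketch below. Note that the paper's MAPE scales each absolute error by the mean measured yield rather than by each sample's own yield, which differs slightly from the more common definition.

```python
import numpy as np

def r2(truth, pred):
    """R^2 as defined above: explained variation over total variation."""
    t_bar = truth.mean()
    return ((pred - t_bar) ** 2).sum() / ((truth - t_bar) ** 2).sum()

def mae(truth, pred):
    return np.abs(truth - pred).mean()

def mape(truth, pred):
    """The paper's MAPE: absolute errors scaled by the mean measured yield."""
    return np.abs(truth - pred).mean() / truth.mean() * 100.0

t = np.array([8200.0, 9100.0, 7600.0, 11050.0])
p = np.array([8450.0, 8800.0, 7900.0, 10600.0])
print(r2(t, p), mae(t, p), mape(t, p))
```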

3. Results

3.1. Relationships between Yield and New VIs and CIs

The reflectance of the R, G, B, RE, and NIR bands in the rice canopy of each plot was obtained from the two-year rice experiments. By analyzing the correlation coefficient between rice yield and the indices calculated from any two of the five bands, the optimal VIs were selected to enhance modeling (Figure 4 and Figure 5). The best band combinations of the six VIs at HS were VI1(G-R, B-G) (Pearson correlation coefficient, R = 0.5032; VI1(G-R, B-G) denotes the VI calculated from the combination of the green, red, and blue bands, and similarly for the other VIs), VI2(R, R-RE) (R = 0.6390), VI3(R, G) (R = 0.6882), VI4(R, G) (R = 0.7036), VI5(RE, R) (R = 0.4602), and VI6(NIR, B) (R = 0.4903) (Figure 4). The best band combinations at MS were VI1(RE-NIR, B-R) (R = 0.7742), VI2(R, B) (R = 0.5921), VI3(NIR, R) (R = 0.6741), VI4(NIR, B) (R = 0.6432), VI5(R, B) (R = 0.8423), and VI6(NIR, R) (R = 0.6402) (Figure 5). Qian [31] studied the correlation between NDVI, the ratio vegetation index (RVI), and rice yield at different stages (RNDVI = 0.24 and RRVI = 0.25 at HS; RNDVI = 0.72 and RRVI = 0.71 at MS). Rahman [32] reported the correlation between the temperature condition index (TCI) and rice yield (RTCI = 0.5 at MS). Compared with these previous results, the band reconstruction method does improve the correlation between VIs and yield. Moreover, the results show that the best combination for these indices is often not the original common combination. Figure 6a,b shows the relationships between yield and the VIs and CIs based on QNR. The best-performing VIs for yield prediction at HS, MS, and AS were VI4 (R2 = 0.62), VI5 (R2 = 0.68), and VI1 (R2 = 0.54), respectively, while the best-performing CIs at these three stages were CI2 (R2 = 0.48), CI3 (R2 = 0.59), and CI1 (R2 = 0.43). The VIs performed significantly better than the CIs, and the prediction effect was best at MS, slightly worse at HS, and worst at AS. The relationship between VI5 and yield, shown in Figure 6, supports this conclusion: when VI5 alone from HS or MS was used as the input variable, the coefficients of determination were R2 = 0.58 (significant at the 0.05 level) and R2 = 0.68 (significant at the 0.01 level), respectively, while the modeling accuracy at AS decreased (R2 = 0.53, significant at the 0.05 level). The same pattern was observed for the CIs.
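The band-combination search described here amounts to evaluating an index formula over every ordered band pair and keeping the pair with the strongest Pearson correlation to yield. Below is a sketch for the normalized-difference form (VI3 in Table 2), with random stand-in reflectance and yield data; variable names are ours.

```python
import numpy as np
from itertools import permutations
from scipy.stats import pearsonr

BANDS = ("B", "G", "R", "RE", "NIR")

def best_ndvi_like_pair(reflectance, yields):
    """Find the ordered band pair whose (R1 - R2)/(R1 + R2) index is most
    correlated with measured yield.

    reflectance: dict mapping band name -> per-plot mean reflectance array.
    """
    best = None
    for b1, b2 in permutations(BANDS, 2):
        vi = (reflectance[b1] - reflectance[b2]) / (reflectance[b1] + reflectance[b2])
        r, _ = pearsonr(vi, yields)
        if best is None or abs(r) > abs(best[2]):
            best = (b1, b2, r)
    return best

rng = np.random.default_rng(3)
refl = {b: rng.uniform(0.05, 0.6, 232) for b in BANDS}
yields = rng.uniform(3800, 14100, 232)
print(best_ndvi_like_pair(refl, yields))
```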

3.2. Relationships between Yield and New TIs

Figure 6c,d shows the correlation between yield and the GLCM-based textures of the five bands. The overall performance of texture at HS was better than that at MS, and texture from the red band outperformed all other bands, followed by the NIR band. CONred (CON from the red band, and similarly for the other textures) performed best at MS (R2 = 0.52) and AS (R2 = 0.41), and ENENIR performed best at HS (R2 = 0.43). Other textures such as CONNIR, CORred, HOMNIR, and ENEred also outperformed most others. Hlatshwayo et al. [14] reported a coefficient of determination of 0.51 for raw band texture in aboveground biomass mapping. These results indicate that the performance of the original GLCM-based textures for yield prediction needs to be improved. Texture performed better than the CIs but worse than the VIs. The key bands generally indicating the nutritional and growth status of rice are the red and NIR bands, so the VIs and TIs including these bands correlated well with yield. In addition, the number of panicles is closely related to yield, and the distribution density of panicles affects texture [18], so texture also achieved acceptable performance in yield prediction (significant at the 0.05 level).
Moreover, the six newly constructed TIs, calculated from the top six combinations (Table 4), were evaluated with the QNR method. The TIs performed much better than the original textures. The top TIs for yield prediction were TI(CONNIR, CORred) (the TI calculated from the combination of CONNIR and CORred, and similarly for the other TIs) at HS, TI(CONNIR, CONred) at MS, and TI(CONNIR, CORred) at AS.

3.3. YI Building

3.3.1. YI Building by QNR Model

The YI for yield estimation was constructed using the top VIs, CIs, and TIs at the two growth stages. The results in Table 5 show that the YIs based on VI, CI, and TI all achieved acceptable verification results, and the VI-based YIs performed better than the CI-based and TI-based ones. The top VI-based YIs at HS, MS, and AS were YIVI4 (R2 = 0.62, MAPE = 17.93%; YIVI4 denotes the YI built from VI4, and similarly for the other YIs), YIVI5 (R2 = 0.67, MAPE = 13.98%), and YIVI1 (R2 = 0.54, MAPE = 24.33%), respectively (Table 5). The top CI-based YIs at the three stages were YICI2 (R2 = 0.48, MAPE = 27.25%), YICI3 (R2 = 0.59, MAPE = 22.21%), and YICI1 (R2 = 0.43, MAPE = 34.52%). YI(CONNIR, ENEred) (R2 = 0.61, MAPE = 18.60%; YI(CONNIR, ENEred) denotes the YI built from TI(CONNIR, ENEred), and similarly for the other YIs), YI(CONred, ENENIR) (R2 = 0.52, MAPE = 25.89%), and YI(CONNIR, CONred) (R2 = 0.46, MAPE = 30.23%) were the top TI-based YIs at the three stages, respectively. The VI-based YI at MS (MAPE = 13.98%) was better than at other stages and outperformed the CI-based (MAPE = 22.21%) and TI-based (MAPE = 18.60%) YIs.

3.3.2. YI Building Based on Fusing VIs, CIs and TIs

The six VIs, six CIs, and six TIs described above were fused to build the YI with MLR and RF models. The VIs and CIs, calculated from different bands, are spectral data, while the TIs, calculated from different GLCM-based textures, describe the arrangement regularity of the rice canopy. The input variables were therefore grouped into three sets: (1) six VIs; (2) six VIs + six CIs; and (3) six VIs + six CIs + six TIs. Adding the VIs, CIs, and TIs to the models step by step reveals the influence of the three kinds of indices on yield prediction accuracy. The YIs built by MLR and RF were validated using the test dataset (Table 6). When the input variables were the fused six VIs, the MLR-based YI performed slightly better at all three stages than the QNR-based YI. However, when the YI was built from VIs + CIs and VIs + CIs + TIs, the MAPE of the yield prediction results increased, possibly because too many inputs led to a complex model and worse prediction performance. The MLR-based YI using VIs alone achieved the highest accuracy (MAPE = 12.60%) at MS, better than the MLR models based on VIs + CIs (MAPE = 13.15%) and VIs + CIs + TIs (MAPE = 13.70%).
As noted above, ntree was set to 500, following previous studies. The value of mtry was adjusted (from 1 to 20) to minimize the MAPE of the validation dataset and obtain a generally optimized RF model. When VIs + CIs + TIs were used as input variables, the validation MAPE at HS first decreased and then increased, with the minimum at mtry = 7. At MS, the MAPE decreased and then held steady, with the minimum at mtry = 12. At AS, the MAPE decreased, then increased, and finally held steady, with the minimum at mtry = 6. The value of mtry was likewise tuned when VIs and VIs + CIs were used as input variables, and the results are shown in Table 7.
Unlike the YI built by MLR, the YI built by RF showed a decreasing MAPE as the number of input variables increased. Compared with the other combinations, the fusion of VIs + CIs + TIs produced the minimum MAPE at all three stages. The best YI, constructed by RF from the combination of VIs + CIs + TIs, was superior to those from QNR and MLR, and the highest accuracy was obtained for yield prediction at MS (MAPE = 7.86%) (Figure 7). As an example, the yield prediction verification results predicted by the YI based on VIs, CIs, and TIs from the RF model, using the UAV imagery of Experiment 5 at Fuyang Station, are visualized for the three stages separately (Figure 8). The differences in yield prediction accuracy among the YIs constructed by different methods in different periods are obvious.
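The stepwise fusion comparison can be sketched as follows, reusing the MS mtry values from Table 7. The data are random stand-ins, so the printed MAPEs will not reproduce Table 6; the column layout and group names are our assumptions.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(4)
n_train, n_test = 174, 58
X_all = rng.random((n_train + n_test, 18))   # cols 0-5 VIs, 6-11 CIs, 12-17 TIs
y_all = 4000 + 10000 * X_all[:, 0] + rng.normal(0, 500, n_train + n_test)

groups = {"VIs": slice(0, 6), "VIs+CIs": slice(0, 12), "VIs+CIs+TIs": slice(0, 18)}
mtry = {"VIs": 5, "VIs+CIs": 4, "VIs+CIs+TIs": 12}    # MS values from Table 7

for name, cols in groups.items():
    rf = RandomForestRegressor(n_estimators=500, max_features=mtry[name],
                               random_state=0)
    rf.fit(X_all[:n_train, cols], y_all[:n_train])
    pred = rf.predict(X_all[n_train:, cols])
    # Paper-style MAPE: errors scaled by the mean measured yield.
    mape = np.abs(y_all[n_train:] - pred).mean() / y_all[n_train:].mean() * 100
    print(f"{name}: MAPE = {mape:.2f}%")
```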

4. Discussion

4.1. Model Performance Using Cross Datasets

To achieve better generalization ability, the models were trained on the combined datasets of 2019 and 2020. The dataset used for validation contains experimental data from 24 varieties and 21 fertilization patterns at different growth stages. During validation, we did not retrain the model, in order to represent the actual challenge of yield prediction. The satisfactory cross-dataset validation result (R2 = 0.73, MAE = 1375.46 kg/ha, and MAPE = 7.86% at MS) shows that our proposed model can overcome the problems caused by different varieties and fertilization patterns.
Although the yield prediction accuracy at AS is not satisfactory, it can be improved by dividing the data into growth stages. The performance at MS is better than at HS (R2 = 0.84, MAE = 714.55 kg/ha, and MAPE = 7.86%), which shows that MS is the best time to estimate yield. Panicles, which largely determine yield, were green and remained upright at HS; the upright panicles appeared as dots in the orthophoto image and accounted for a small proportion of pixels in the whole image. Dense rice leaves and erect panicles made it difficult to observe and extract effective features from UAV images, and in the subsequent growth period, climate and agricultural management also strongly affect yield. These factors led to unsatisfactory yield prediction performance at HS. At MS, the panicles turned yellow and became curved, making it easy to distinguish panicles from leaves in the orthophoto image of the rice canopy; curved panicles occupy a higher proportion of pixels in the entire image. Clear and accurate information on panicle size and distribution density at MS helped improve the prediction accuracy. For the AS dataset, rice canopy spectra and textures change dynamically with rice growth, which leads to poor prediction performance. Moreover, some growth stages are short and the boundaries between them are not obvious, which challenges this stage-based prediction method.

4.2. Performance of the Novel YI

In this study, the YI built by RF from the six VIs + CIs + TIs obtained the best performance at MS (R2 = 0.84, MAE = 714.55 kg/ha, MAPE = 7.86%). Naito et al. [18] showed that a simple ratio index exhibited the best performance in estimating grain weight (R2 = 0.80). The top single-VI YIs for yield prediction are YIVI4 (R2 = 0.62, HS) and YIVI5 (R2 = 0.67, MS) (Table 5). The best performances of the YIs built by MLR were R2 = 0.69 (HS) and R2 = 0.72 (MS), and those built by RF reached R2 = 0.79 (HS) and R2 = 0.84 (MS) (Table 6). Compared with a single VI, the proposed YI integrates more spectral and image characteristics of the crop to improve yield prediction. Kang et al. [33] proposed an artificial neural network rice yield prediction model based on six VIs, which obtained stable prediction accuracy (R2 ≥ 0.71, RMSE ≤ 29.0 kg/1000 m2) under one variety and six nitrogen treatments; three of their six indices have the same form as the indices we used. Wang et al. [34] showed that introducing fluorescence spectral information at the flowering stage into conventional VI-based yield estimation models helps improve rice yield estimation accuracy (R2 = 0.869, MAPE = 3.98%, RMSE = 396.02 kg/ha) under four varieties and five nitrogen treatments. In these studies, the prediction accuracy is better than ours; however, their models were only validated on rice materials with a few varieties and several nitrogen gradient treatments in a single year. The YI proposed in our study was built on RF by fusing the spectral and image information of 24 rice varieties under 21 fertilization modes in different regions over two years. When applied to the test dataset, the YI showed good stability for yield prediction.

4.3. Advantages of the Novel YI

VIs help to enhance the interpretation of remote sensing data and have been widely used for processing remote sensing information [35]. The top three VIs at HS are VI4(R, G) (R = 0.7036), VI3(R, G) (R = 0.6882), and VI2(R, R-RE) (R = 0.6390) (Figure 4). The top three VIs at MS are VI5(R, B) (R = 0.8423), VI1(RE-NIR, B-R) (R = 0.7742), and VI3(NIR, R) (R = 0.6741) (Figure 5). The new VIs calculated from the canopy spectrum (especially the red, red edge, and near-infrared bands) monitor rice yield well, with high correlation. Leaf color indicates the nutritional status of rice, and the number of flowers and ears is closely related to yield [5]. The CIs extracted from RGB images can sensitively perceive the color changes of leaves and ears during rice heading (CI2, R2 = 0.48 at HS) and ripening (CI3, R2 = 0.59 at MS). The four texture features of each gray-scale band image were newly combined into normalized TIs. Compared with the top original texture feature (CONred, R2 = 0.52 at MS), the TI (TI(CONNIR, CONred), R2 = 0.61 at MS) better represents the changes in rice yield, which may be affected by the distribution density and size of the rice ears. Therefore, the YI built by fusing rice canopy spectrum, color, and texture is more robust than traditional remote sensing data processing methods across varieties, regions, and fertilizer treatments.

4.4. Potential Applications of the Novel YI

It has been verified that the new YI developed by fusing VIs, CIs, and TIs with RF (MAPE = 7.86%) reduces MAPE by 1.86% and 6.12% compared with that built on MLR (9.72%) and QNR (13.98%). It may help to better implement crop management, especially in crop insurance, harvest planning, storage demand, cash flow budgeting, and input decisions such as nutrition, pesticide, and water. Rice yield is the most important parameter and the ultimate target index in rice planting and breeding research. The new yield index can support the breeding of new rice varieties with high yield and good stress resistance. With the rapid development of breeding technology, hundreds of new breeding materials need to be processed at key growth stages. Using the yield index enables highly automated, precise, and rapid digital measurement, breaking through the bottleneck of traditional rice yield phenotype measurement.

5. Conclusions

This study developed a novel rice yield index using UAV remote sensing imagery fusion technology. Our findings suggest that the yield index (YI) built by RF by fusing vegetation indices (VIs), color indices (CIs), and texture indices (TIs) extracted from unmanned aerial vehicle (UAV) imagery has great potential in rice yield monitoring. The relationships between yield and the VIs, CIs, and TIs were analyzed to select input variables for building the YI. The results show that the newly constructed VIs and TIs performed better than the original specific-band indices and single textures. VI4(R, G) and VI5(R, B) exhibited the highest correlation with yield at HS and MS, respectively. The VI-based YI built by QNR at MS performed better than at other stages and outperformed the CI-based and TI-based YIs. Compared with HS, the best YI was developed by RF by combining VIs + CIs + TIs at MS (R2 = 0.84, MAE = 714.55 kg/ha, MAPE = 7.86%). This research proved the feasibility of a YI from UAV-based imagery for yield prediction. Further improvements in modeling methods and the optimal data collection height are needed for both efficiency and accuracy, to provide accurate data for precision agricultural management and yield phenotyping.

Author Contributions

Conceptualization, F.L. and J.Z.; methodology, F.L., J.Z. and X.L.; software, J.Z. and X.L.; validation, J.Z., X.L. and R.Y.; investigation, J.Z. and X.L.; resources, F.L., H.C., Y.W. and Y.Z.; data curation, F.L., H.C., Y.W. and Y.Z.; writing—original draft preparation, J.Z. and F.L.; writing—review and editing, J.Z., J.H. and F.L.; visualization, J.Z., R.Y., X.L. and J.H.; project administration, F.L., H.C. and Y.W. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by Science and Technology Department of Zhejiang Province [2020C02016] and Collaborative Extension Program of Major Agricultural Technologies of Zhejiang Province of China [2021XTTGLY0104].

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data that support the findings of this study are available from the corresponding author upon reasonable request.

Conflicts of Interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

References

1. Ouattara, T.A.; Sokeng, V.C.J.; Zo-Bi, I.C.; Kouame, K.F.; Grinand, C.; Vaudry, R. Detection of forest tree losses in Côte d'Ivoire using drone aerial images. Drones 2022, 6, 83.
2. Padua, L.; Antao-Geraldes, A.M.; Sousa, J.J.; Rodrigues, M.A.; Oliveira, V.; Santos, D.; Miguens, M.F.P.; Castro, J.P. Water hyacinth (Eichhornia crassipes) detection using coarse and high resolution multispectral data. Drones 2022, 6, 47.
3. Dalla Corte, A.P.; Neto, E.D.M.; Rex, F.E.; Souza, D.; Behling, A.; Mohan, M.; Sanquetta, M.N.I.; Silva, C.A.; Klauberg, C.; Sanquetta, C.R.; et al. High-density UAV-LiDAR in an integrated crop-livestock-forest system: Sampling forest inventory or forest inventory based on individual tree detection (ITD). Drones 2022, 6, 48.
4. Cao, Q.; Miao, Y.X.; Feng, G.H.; Gao, X.W.; Li, F.; Liu, B.; Yue, S.C.; Cheng, S.S.; Ustin, S.L.; Khosla, R. Active canopy sensing of winter wheat nitrogen status: An evaluation of two sensor systems. Comput. Electron. Agric. 2015, 112, 54–67.
5. Gnyp, M.L.; Miao, Y.X.; Yuan, F.; Ustin, S.L.; Yu, K.; Yao, Y.K.; Huang, S.Y.; Bareth, G. Hyperspectral canopy sensing of paddy rice aboveground biomass at different growth stages. Field Crops Res. 2014, 155, 42–55.
6. Khaliq, A.; Comba, L.; Biglia, A.; Aimonino, D.R.; Chiaberge, M.; Gay, P. Comparison of satellite and UAV-based multispectral imagery for vineyard variability assessment. Remote Sens. 2019, 11, 436.
7. Gu, Z.J.; Ju, W.M.; Li, L.; Li, D.Q.; Liu, Y.B.; Fan, W.L. Using vegetation indices and texture measures to estimate vegetation fractional coverage (VFC) of planted and natural forests in Nanjing City, China. Adv. Space Res. 2013, 51, 1186–1194.
8. Zhang, J.Y.; Liu, X.; Liang, Y.; Cao, Q.; Tian, Y.C.; Zhu, Y.; Cao, W.X.; Liu, X.J. Using a portable active sensor to monitor growth parameters and predict grain yield of winter wheat. Sensors 2019, 19, 1108.
9. Li, W.; Niu, Z.; Chen, H.Y.; Li, D.; Wu, M.Q.; Zhao, W. Remote estimation of canopy height and aboveground biomass of maize using high-resolution stereo images from a low-cost unmanned aerial vehicle system. Ecol. Indic. 2016, 67, 637–648.
10. Li, B.; Xu, X.M.; Zhang, L.; Han, J.W.; Bian, C.S.; Li, G.C.; Liu, J.G.; Jin, L.P. Above-ground biomass estimation and yield prediction in potato by using UAV-based RGB and hyperspectral imaging. ISPRS J. Photogramm. Remote Sens. 2020, 162, 161–172.
11. Wan, L.; Li, Y.J.; Cen, H.Y.; Zhu, J.P.; Yin, W.X.; Wu, W.K.; Zhu, H.Y.; Sun, D.W.; Zhou, W.J.; He, Y. Combining UAV-based vegetation indices and image classification to estimate flower number in oilseed rape. Remote Sens. 2018, 10, 1484.
12. Maimaitijiang, M.; Sagan, V.; Sidike, P.; Hartling, S.; Esposito, F.; Fritschi, F.B. Soybean yield prediction from UAV using multimodal data fusion and deep learning. Remote Sens. Environ. 2020, 237, 111599.
13. Ge, H.X.; Ma, F.; Li, Z.W.; Du, C.W. Grain yield estimation in rice breeding using phenological data and vegetation indices derived from UAV images. Agronomy 2021, 11, 2439.
14. Hlatshwayo, S.T.; Mutanga, O.; Lottering, R.T.; Kiala, Z.; Ismail, R. Mapping forest aboveground biomass in the reforested Buffelsdraai landfill site using texture combinations computed from SPOT-6 pan-sharpened imagery. Int. J. Appl. Earth Obs. Geoinf. 2019, 74, 65–77.
15. Lu, D. Aboveground biomass estimation using Landsat TM data in the Brazilian Amazon. Int. J. Remote Sens. 2005, 26, 2509–2525.
16. Yang, M.D.; Huang, K.S.; Kuo, Y.H.; Tsai, H.P.; Lin, L.M. Spatial and spectral hybrid image classification for rice lodging assessment through UAV imagery. Remote Sens. 2017, 9, 583.
17. Zhou, X.; Zheng, H.B.; Xu, X.Q.; He, J.Y.; Ge, X.K.; Yao, X.; Cheng, T.; Zhu, Y.; Cao, W.X.; Tian, Y.C. Predicting grain yield in rice using multi-temporal vegetation indices from UAV-based multispectral and digital imagery. ISPRS J. Photogramm. Remote Sens. 2017, 130, 246–255.
18. Naito, H.; Ogawa, S.; Valencia, M.O.; Mohri, H.; Urano, Y.; Hosoi, F.; Shimizu, Y.; Chavez, A.L.; Ishitani, M.; Selvaraj, M.G.; et al. Estimating rice yield related traits and quantitative trait loci analysis under different nitrogen treatments using a simple tower-based field phenotyping system with modified single-lens reflex cameras. ISPRS J. Photogramm. Remote Sens. 2017, 125, 50–62.
19. Zhang, K.; Ge, X.K.; Shen, P.C.; Li, W.Y.; Liu, X.J.; Cao, Q.; Zhu, Y.; Cao, W.X.; Tian, Y.C. Predicting rice grain yield based on dynamic changes in vegetation indexes during early to mid-growth stages. Remote Sens. 2019, 11, 387.
20. Elmetwalli, A.H.; El-Hendawy, S.; Al-Suhaibani, N.; Alotaibi, M.; Tahir, M.U.; Mubushar, M.; Hassan, W.M.; Elsayed, S. Potential of hyperspectral and thermal proximal sensing for estimating growth performance and yield of soybean exposed to different drip irrigation regimes under arid conditions. Sensors 2020, 20, 6569.
21. Eckert, S. Improved forest biomass and carbon estimations using texture measures from WorldView-2 satellite data. Remote Sens. 2012, 4, 810–829.
22. Vina, A.; Gitelson, A.A.; Nguy-Robertson, A.L.; Peng, Y. Comparison of different vegetation indices for the remote assessment of green leaf area index of crops. Remote Sens. Environ. 2011, 115, 3468–3478.
23. Jordan, C.F. Derivation of leaf-area index from quality of light on the forest floor. Ecology 1969, 50, 663–666.
24. Atzberger, C.; Guerif, M.; Baret, F.; Werner, W. Comparative analysis of three chemometric techniques for the spectroradiometric assessment of canopy chlorophyll content in winter wheat. Comput. Electron. Agric. 2010, 73, 165–173.
25. Huang, W.J.; Guan, Q.S.; Luo, J.H.; Zhang, J.C.; Zhao, J.L.; Liang, D.; Huang, L.S.; Zhang, D.Y. New optimized spectral indices for identifying and monitoring winter wheat diseases. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2014, 7, 2516–2524.
26. Huete, A.R. A soil-adjusted vegetation index (SAVI). Remote Sens. Environ. 1988, 25, 295–309.
27. Meyer, G.E.; Neto, J.C. Verification of color vegetation indices for automated crop imaging applications. Comput. Electron. Agric. 2008, 63, 282–293.
28. Bendig, J.; Yu, K.; Aasen, H.; Bolten, A.; Bennertz, S.; Broscheit, J.; Gnyp, M.L.; Bareth, G. Combining UAV-based plant height from crop surface models, visible, and near infrared vegetation indices for biomass monitoring in barley. Int. J. Appl. Earth Obs. Geoinf. 2015, 39, 79–87.
29. Gitelson, A.A.; Kaufman, Y.J.; Stark, R.; Rundquist, D. Novel algorithms for remote estimation of vegetation fraction. Remote Sens. Environ. 2002, 80, 76–87.
30. Zhang, J.Y.; Qiu, X.L.; Wu, Y.T.; Zhu, Y.; Cao, Q.; Liu, X.J.; Cao, W.X. Combining texture, color, and vegetation indices from fixed-wing UAS imagery to estimate wheat growth parameters using multivariate regression methods. Comput. Electron. Agric. 2021, 185, 106138.
31. Qian, C.; Wu, X.J. Mapping paddy rice yield in Zhejiang Province using MODIS spectral index. Turk. J. Agric. For. 2011, 35, 579–589.
32. Rahman, A.; Roytman, L.; Krakauer, N.Y.; Nizamuddin, M.; Goldberg, M. Use of vegetation health data for estimation of Aus rice yield in Bangladesh. Sensors 2009, 9, 2968–2975.
33. Kang, Y.; Nam, J.; Kim, Y.; Lee, S.; Seong, D.; Jang, S.; Ryu, C. Assessment of regression models for predicting rice yield and protein content using unmanned aerial vehicle-based multispectral imagery. Remote Sens. 2021, 13, 1508.
34. Wang, F.M.; Yao, X.P.; Xie, L.L.; Zheng, J.Y.; Xu, T.Y. Rice yield estimation based on vegetation index and florescence spectral information from UAV hyperspectral remote sensing. Remote Sens. 2021, 13, 3390.
35. Zheng, H.B.; Cheng, T.; Li, D.; Zhou, X.; Yao, X.; Tian, Y.C.; Cao, W.X.; Zhu, Y. Evaluation of RGB, color-infrared and multispectral images acquired from unmanned aerial systems for the estimation of nitrogen accumulation in rice. Remote Sens. 2018, 10, 824.
Figure 1. The general locations and orthophoto images of the six experimental fields: (a) Experiment 1 (2019); (b) Experiment 2 (2019); (c) Experiment 3 (2020); (d) Experiment 4 (2020); (e) Experiment 5 (2020); (f) Experiment 6 (2020). Note: the red dot and dotted lines indicate that the data of the six experiments were collected in Hangzhou.
Figure 2. Flow chart of the dataset segmentation method. Note: the purple and green lines represent the construction of the training and testing datasets, and the blue lines represent the YIs built by the models.
Figure 3. Research flow chart for developing the novel rice yield index.
Figure 4. Correlation analysis between yield and VIs of different band combinations at HS: (a) VI1; (b) VI2; (c) VI3; (d) VI4; (e) VI5; (f) VI6.
Figure 5. Correlation analysis between yield and VIs of different band combinations at MS: (a) VI1; (b) VI2; (c) VI3; (d) VI4; (e) VI5; (f) VI6.
Figure 6. Coefficient of determination (R2) between yield and the VIs, CIs, and textures: (a) R2 between VI and yield; (b) R2 between CI and yield; (c) R2 between texture and yield at HS; (d) R2 between texture and yield at MS. (* significant at the 0.05 level; ** significant at the 0.01 level; no symbol, not significant.)
Figure 7. The yield prediction validation results of the YI: (a) built by QNR at HS; (b) built by QNR at MS; (c) built by MLR at HS; (d) built by MLR at MS; (e) built by RF at HS; (f) built by RF at MS. Note: the red lines are the fitted lines between predicted and true yield; the blue dots and squares represent samples from the testing dataset at HS and MS.
Figure 8. An example visualization of the yield prediction verification results of Experiment 5 by the YI based on VIs, CIs, and TIs from the RF model at Fuyang Station: (a) yield prediction result at HS; (b) yield prediction result at MS; (c) yield prediction result at AS.
Table 1. Information on the field experiments. Note: Exp. 1 represents Experiment 1, and so on; N represents nitrogen fertilizer; BF, PF, and TF represent base fertilizer, panicle fertilizer, and tillering fertilizer, respectively.
Experiments | Rice Cultivars | Fertilization Treatments (N Treatments) | Sampling Dates (Stages)
2019 Exp. 1 (Fuyang), 24 plots | 4 varieties: Ezao 18; Zhuliangyou 189; Zhuliangyou 819; Liangyou 287 | (1) blank control group; (2) N 150 kg·ha−1, BF:PF:TF = 5:3:2 | 26 August (heading), 24 October (maturity)
2019 Exp. 2 (Fuyang), 96 plots | 2 varieties: Zhongzao 39; Zhongjiazao 17 | (1) blank control group; (2) ordinary N 180 kg·ha−1, BF:PF:TF = 5:3:2; (3) special N 180 kg·ha−1, BF:PF:TF = 8:1:1; (4) special N 150 kg·ha−1, BF:PF:TF = 8:1:1; (5) slow-release N 150 kg·ha−1, BF:PF:TF = 8:1:1; (6) slow-release N 150 kg·ha−1, single basal fertilization; (7) N 150 kg·ha−1 in 10 different brands of fertilizers, single basal fertilization | 26 August (heading), 24 October (maturity)
2020 Exp. 3 (Pingyao), 11 plots | 1 variety: Yongyou 1540 | (1) side-deep slow-release N 160 kg·ha−1, BF:PF = 5:5; (2) side-deep slow-release N 160 kg·ha−1, BF:PF = 8:3; (3) side-deep slow-release N 160 kg·ha−1, BF:PF = 3:8; (4) N 160 kg·ha−1, single basal fertilization | 24 August (heading), 10 October (maturity)
2020 Exp. 4 (Yuhang), 19 plots | 19 varieties: Yongyou 1540; Yongyou 7850; Yongyou 7860; Yongyou 7872; Yongyou 6711; Chunyou 801; Chengyou 13; Zhejiang Jingyou 1578; Xiuyou 4913; Xiuyou 71207; Jiaheyou 5; Xiushui 134; Jia 67; Zhongjia 8; Zhejiang Jing 99; Zhejiang Jing 100; Zhejiang Hujing 25; Chunjiang 157; Wankenjing 11036 | (1) N 207 kg·ha−1, BF:PF:TF = 3:3:4 | 24 August (heading), 31 October (maturity)
2020 Exp. 5 (Fuyang), 28 plots | 4 varieties: Yongyou 1540; Yongyou 17; Zhongzheyou 8; Xiushui 134 | (1) blank control group; (2) N 160 kg·ha−1, BF:PF:TF = 4:7:5; (3) N 210 kg·ha−1, BF:PF:TF = 6:10:5; (4) N 220 kg·ha−1, BF:PF:TF = 4:7:11; (5) N 260 kg·ha−1, BF:PF:TF = 8:13:5; (6) N 270 kg·ha−1, BF:PF:TF = 6:10:11; (7) N 320 kg·ha−1, BF:PF:TF = 8:13:11 | 17 August (heading), 30 October (maturity)
2020 Exp. 6 (Fuyang), 54 plots | 1 variety: Yongyou 12 | (1) blank control group; (2) N 0 kg·ha−1, with plastic film; (3) N 195 kg·ha−1, without plastic film; (4) N 165 kg·ha−1, with biodegradable membrane; (5) N 195 kg·ha−1, with biodegradable membrane | 17 August (heading), 30 October (maturity)
Table 2. The six VIs and six CIs selected in this study. Rλ1, Rλ2, Rλ3, and Rλ4 represent the reflectance of any bands selected from R, G, B, RE, and NIR.
Index Name | Formula | Reference
VI1 | (Rλ1 − Rλ2)/(Rλ3 − Rλ4) | [22]
VI2 | Rλ1 − Rλ2 | [23]
VI3 | (Rλ1 − Rλ2)/(Rλ1 + Rλ2) | [24]
VI4 | Rλ1/Rλ2 | [25]
VI5 | 1.5(Rλ1 − Rλ2)/(Rλ1 + Rλ2 + 0.5) | [26]
VI6 | 1.16(Rλ1 − Rλ2)/(Rλ1 + Rλ2 + 0.16) | [11]
CI1 | 2g − b − r | [27]
CI2 | (g² − r²)/(g² + r²) | [28]
CI3 | (g² − br)/(g² + br) | [28]
CI4 | (r − g)/(r + g − b) | [29]
CI5 | 3g − 2.4r − b | [27]
CI6 | (2g − b − r)/(2g + b + r) | [30]
Table 3. Statistical analysis of the rice yield data. Note: SD represents standard deviation, and CV represents the coefficient of variation.
Datasets | Number of Samples | Range (kg/ha) | Average (kg/ha) | SD | CV (%)
Training dataset | 174 | 3820.5–13,960.8 | 8546.6 | 2472.8 | 28.93
Test dataset | 58 | 4558.6–14,117.4 | 8625.6 | 2524.2 | 29.26
All | 232 | 3820.5–14,117.4 | 8566.4 | 2480.5 | 28.96
Table 4. The six top R2 values between TI and yield. Note: * significant at the 0.05 level; ** significant at the 0.01 level. TI(T1, T2) is calculated from the combination of two GLCM-based textures (T1 and T2).
Model | Input TI | HS | MS | AS
QNR | TI(CONNIR, ENEred) | 0.58 * | 0.56 * | 0.46 *
QNR | TI(CONred, ENENIR) | 0.50 * | 0.44 * | 0.36 *
QNR | TI(CONNIR, CONred) | 0.47 * | 0.42 * | 0.30 *
QNR | TI(CONNIR, CONred) | 0.63 ** | 0.61 * | 0.32 *
QNR | TI(CONNIR, ENEred) | 0.56 * | 0.58 * | 0.45 *
QNR | TI(CONNIR, CORred) | 0.64 ** | 0.60 ** | 0.48 *
Table 5. The top three verification results of the YIs based on VIs, CIs, and TIs in QNR. Note: * significant at the 0.05 level; ** significant at the 0.01 level.
Stages | Variable | R2 | MAE (kg/ha) | MAPE (%)
YIs based on VI
HS | YIVI4 | 0.62 ** | 1247.95 | 17.93
MS | YIVI5 | 0.67 ** | 1148.03 | 13.98
AS | YIVI1 | 0.54 * | 1332.70 | 24.33
YIs based on CI
HS | YICI2 | 0.48 * | 1498.80 | 27.25
MS | YICI3 | 0.59 * | 1446.21 | 22.21
AS | YICI1 | 0.43 * | 1562.96 | 34.52
YIs based on TI
HS | YI(CONNIR, ENEred) | 0.61 ** | 1376.79 | 18.60
MS | YI(CONred, ENENIR) | 0.52 * | 1470.40 | 25.89
AS | YI(CONNIR, CONred) | 0.46 * | 1481.97 | 30.23
Table 6. Verification results of the test dataset on the MLR and RF models. Note: * significant at the 0.05 level; ** significant at the 0.01 level.
Stages | Technique | VIs (R2 / MAE (kg/ha) / MAPE (%)) | VIs + CIs (R2 / MAE / MAPE) | VIs + CIs + TIs (R2 / MAE / MAPE)
HS | MLR | 0.69 ** / 1264.55 / 13.91 | 0.65 ** / 1374.55 / 15.12 | 0.61 ** / 1690.92 / 18.60
HS | RF | 0.70 ** / 1245.46 / 13.70 | 0.79 ** / 851.82 / 9.37 | 0.78 ** / 883.64 / 9.72
MS | MLR | 0.72 ** / 1145.46 / 12.60 | 0.71 ** / 1195.46 / 13.15 | 0.70 ** / 1245.46 / 13.70
MS | RF | 0.73 ** / 1095.46 / 12.05 | 0.80 ** / 748.18 / 8.23 | 0.84 ** / 714.55 / 7.86
AS | MLR | 0.54 * / 2211.83 / 24.33 | 0.50 * / 2372.74 / 26.70 | 0.52 * / 2353.65 / 25.89
AS | RF | 0.51 * / 2393.65 / 26.33 | 0.65 ** / 1374.55 / 15.12 | 0.61 ** / 1690.92 / 18.60
Table 7. The optimal mtry value of the RF model at the three stages.
Stages | VIs | VIs + CIs | VIs + CIs + TIs
HS | 3 | 5 | 7
MS | 5 | 4 | 12
AS | 4 | 6 | 8
