Article

Estimating Plant Nitrogen Concentration of Rice through Fusing Vegetation Indices and Color Moments Derived from UAV-RGB Images

1 The State Key Laboratory of Soil and Sustainable Agriculture, Institute of Soil Science, Chinese Academy of Sciences, Nanjing 210008, China
2 College of Advanced Agricultural Sciences, University of Chinese Academy of Sciences, Beijing 100049, China
3 Yuan Longping High-Tech Agriculture Co., Ltd., Changsha 410001, China
* Author to whom correspondence should be addressed.
Submission received: 15 February 2021 / Revised: 17 April 2021 / Accepted: 19 April 2021 / Published: 21 April 2021

Abstract

Plant nitrogen concentration (PNC) has previously been estimated using vegetation indices (VIs) from UAV-based imagery, but color features have rarely been considered as additional variables. In this study, VIs and color moments (color features) were calculated from UAV-based RGB images, and then partial least squares regression (PLSR) and random forest (RF) regression models were established to estimate PNC by fusing VIs and color moments. The results demonstrated that fusing VIs and color moments as inputs yielded higher PNC estimation accuracy than using VIs or color moments alone. The RF models based on the combination of VIs and color moments (R2 ranging from 0.69 to 0.91 and NRMSE ranging from 0.07 to 0.13) performed similarly to the PLSR models (R2 ranging from 0.68 to 0.87 and NRMSE ranging from 0.10 to 0.29). Among the top five important variables in the RF models, at least one variable belonged to the color moments in each dataset, indicating the significant contribution of color moments to improved PNC estimation accuracy. This reveals the great potential of combining RGB-VIs and color moments for estimating rice PNC.

Graphical Abstract

1. Introduction

Rice (Oryza sativa L.) is one of the most important crops in the world, feeding more than half of the world’s population [1]. Nitrogen (N) is an essential nutrient for rice growth and an important limiting factor of soil productivity. Plant nitrogen concentration (PNC) is commonly used as an indicator of crop N status [2]. Timely and accurate assessment of PNC to detect N excess or deficiency is essential for farmers to improve rice production and N use efficiency [3,4]. Laboratory analysis is an important way to determine crop N nutrition status; however, field investigation and representative sampling are time-consuming and laborious, and the results are usually delayed [5]. Compared with traditional laboratory analysis (e.g., the micro-Kjeldahl method), non-destructive and timely methods and tools to detect crop N status have increased significantly over recent decades [6,7,8]. Remote sensing, being fast and non-destructive, has proved useful for acquiring information on crop nutrition status [9].
On regional or global scales, satellite remote sensing data have been analyzed to retrieve N status [9,10]. Spectral wavebands in the green, red, and red-edge regions have been used to calculate vegetation indices (VIs) and relate them to crop N [7]. VIs and biophysical variables derived from Sentinel-2 satellite images have been examined for estimating plant N status [11]. Loozen et al. [12] used satellite-based VIs and environmental variables to map canopy N in European forests. Nevertheless, the application of satellite-based VIs to crop N status assessment remains challenging because of limited spatial resolution, infrequent satellite overpasses, and the risk of poor image quality under unfavorable atmospheric conditions [13].
Unmanned aerial vehicle (UAV) platforms, with the advantages of low cost and high spatio-temporal resolution of image data, have become a promising approach for monitoring crop growth status [14,15]. In particular, light-weight, cost-efficient consumer-grade UAVs with RGB digital cameras can be operated simply and flexibly [16]. VIs from RGB images can distinguish crops from background soil and other interferences [17]. Moreover, color images contain a large amount of information about phenotypic traits such as biomass, plant height, and leaf senescence [18,19,20]. Several studies have demonstrated the potential of VIs and color parameters derived from RGB images for assessing crop growth and nutrition status [17,21]. Jiang et al. [22] confirmed that a newly proposed true color vegetation index (TCVI) based on UAV RGB images improved the accuracy of leaf N concentration (LNC) estimation for winter wheat under different conditions. The combination of UAV-based VIs and color parameters yielded reliable estimates of plant N density in winter wheat [23]. Rorie et al. [24] found a close relationship between the dark green color index (DGCI) calculated from digital images and LNC in corn, suggesting that color image analysis is an appropriate tool for assessing crop N nutrition status at field scale. The application of VIs from RGB images to assess crop N status is thus considered a promising alternative to expensive sensors such as hyperspectral and multispectral instruments. Some studies have also fused texture and spectral information from UAV RGB images to monitor crop growth status [4,25,26]. Besides texture and spectral information, color is one of the most important features of RGB images. Among color features, color moments, proposed by Stricker and Orengo [27], are simple and effective; their mathematical basis is that any color distribution in an image can be represented by its moments. However, the contribution of color features to crop N estimation remains unclear.
In this study, VIs and color moments were used to estimate rice PNC based on partial least squares regression (PLSR) and random forest (RF) regression algorithms [12,26,28]. The objectives were (i) to examine the performance of VIs and color moments from UAV RGB images for PNC estimation in rice using PLSR and RF; and (ii) to explore the potential of improving rice PNC estimation accuracy by fusing VIs and color moments with these two methods.

2. Materials and Methods

2.1. Experiment Design

The field experiment (Figure 1) was carried out at the Tangquan experimental station of the Institute of Soil Science, Chinese Academy of Sciences, in Nanjing, Jiangsu Province, China (32°04′15″ N, 118°28′21″ E). The station lies in the subtropical monsoon climate zone, with an average annual rainfall of 1102.2 mm and an average annual temperature of approximately 15.4 °C. The soil at the station is a paddy soil with 22.26 g/kg organic matter, 1.31 g/kg total N, 15.41 mg/kg Olsen-P, and 146.4 mg/kg NH4OAc-K. The rice cultivars were Wuyunjing 23 in 2018 and Nanjing 5055 in 2019. Treatments consisted of five N fertilization rates with four replications, for a total of 20 plots, each 4 m × 10 m. In Table 1, N0 denotes no N fertilizer; N1 denotes conventional fertilizer; and N2, N3, and N4 denote N applied as mixtures of urea and coated urea containing 30%, 40%, and 50% slow-release N, respectively. In 2018, the rice was transplanted on 12 June and harvested on 22 October; in 2019, it was transplanted on 12 June and harvested on 12 November.

2.2. Data Collection

2.2.1. Determination of PNC

Ground destructive sampling was carried out at the tillering, jointing, and flowering stages (Figure 1 and Table 2). One hill of rice was destructively and randomly sampled within the sampling region of each plot at each sampling time (Figure 1). All plant samples were oven-dried at 105 °C for 30 min, then at 80 °C until constant weight, and finally ground for chemical analysis in the laboratory. PNC (%) was determined using the micro-Kjeldahl method [29].

2.2.2. Image Acquisition

UAV images were acquired on the same dates as the rice plant sampling (Table 2). A consumer-grade camera mounted on a UAV (Phantom 4 Pro, SZ DJI Technology Co., Ltd., Shenzhen, China) was used in this study. The UAV had a flight duration of about 30 min, depending on the actual working conditions. The digital camera was equipped with a one-inch complementary metal-oxide semiconductor (CMOS) sensor with a resolution of approximately 20 megapixels. The UAV was flown over the rice fields at a flight altitude of 50 m above ground level (ground sampling distance: 1.36 cm). Flights were carried out under stable ambient light between 11:00 and 13:00 local time using auto-exposure. The UAV flew automatically along preset waypoints, yielding approximately 80% forward overlap and 60% side overlap, controlled by the Pix4Dcapture application.

2.3. Image Data Processing

2.3.1. Image Mosaic

Orthophotos of the experimental site were generated with Pix4Dmapper Version 1.1.38 (https://www.pix4d.com/, accessed on 10 September 2019), which provides an automated pipeline of image alignment, matching, mosaicking, dense point cloud construction, and finally orthoimage generation in GeoTIFF format.

2.3.2. Calculation of VIs

Before calculating VIs from the orthophotos at multiple stages, an empirical line correction method was applied for radiometric calibration [30]. In addition, we used the optimal index method proposed by Qiu et al. [31] to remove background effects. First, the VI best correlated with PNC at each phenological stage was selected as the optimal vegetation index (OVI). Then, ArcGIS 10.1 (ESRI, Redlands, CA, USA) was used to divide the OVI into five levels using the natural breaks (Jenks) classification, and the three middle levels were taken as the canopy area. After background removal, ArcGIS 10.1 was used to delineate the region of interest (ROI) of each plot in the non-sampling area (Figure 1). Most of the selected color indices are well known and have been used to estimate leaf chlorophyll content, biomass, and N status [13,17,26,32]. The R, G, and B channels of the ortho-images were used to calculate twelve VIs (Table 3). The “Zonal Statistics As Table” function in ArcGIS 10.1 was then used to extract the mean value of the ROI in each plot.
Table 3. Summary of variables used in this study.

| Data type | Variable | Equation/Description | Reference |
|---|---|---|---|
| RGB-VIs | NRI | R/(R + G + B) | [33] |
| | NGI | G/(R + G + B) | [33] |
| | NBI | B/(R + G + B) | [33] |
| | G/R | G/R | [34] |
| | G/B | G/B | [34] |
| | R/B | R/B | [34] |
| | ExR | (1.4R − G)/(G + R + B) | [35] |
| | ExG | (2G − R − B)/(G + R + B) | [36] |
| | GMR | G − R | [17] |
| | INT | (R + G + B)/3 | [37] |
| | VARI | (G − R)/(G + R − B) | [38] |
| | NGRDI | (G − R)/(G + R) | [13] |
| Color moments | H | The average of hue | [27] |
| | H_var | The variance of hue | [27] |
| | H_ske | The skewness of hue | [27] |
| | S | The average of saturation | [27] |
| | S_var | The variance of saturation | [27] |
| | S_ske | The skewness of saturation | [27] |
| | V | The average of value | [27] |
| | V_var | The variance of value | [27] |
| | V_ske | The skewness of value | [27] |
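For illustration, the following minimal sketch computes the twelve RGB-VIs of Table 3 from the mean R, G, and B values of a plot ROI after radiometric calibration and background removal; the function name and inputs are illustrative rather than taken from the original processing chain.

```python
def rgb_vis(R: float, G: float, B: float) -> dict:
    """Compute the twelve RGB vegetation indices of Table 3
    from the mean channel values of one plot ROI."""
    total = R + G + B
    return {
        "NRI": R / total,
        "NGI": G / total,
        "NBI": B / total,
        "G/R": G / R,
        "G/B": G / B,
        "R/B": R / B,
        "ExR": (1.4 * R - G) / total,
        "ExG": (2 * G - R - B) / total,
        "GMR": G - R,
        "INT": total / 3,
        "VARI": (G - R) / (G + R - B),
        "NGRDI": (G - R) / (G + R),
    }
```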

2.3.3. Calculation of Color Moments

Because the information of a color distribution is concentrated mainly in its low-order moments, the first-, second-, and third-order color moments suffice to express the distribution. Compared with color histograms, which divide the color space into small intervals and count the pixels in each interval, this method requires no color-space quantization and yields a feature vector of lower dimension. Accordingly, nine components (three moments for each of the three color channels) cover the color moments of an image, which is very concise compared with other color features. The color moments were calculated as follows [39]:
\( \mu_i = \frac{1}{N}\sum_{j=1}^{N} p_{i,j} \)  (1)

\( \sigma_i = \left( \frac{1}{N}\sum_{j=1}^{N} \left( p_{i,j} - \mu_i \right)^2 \right)^{1/2} \)  (2)

\( s_i = \left( \frac{1}{N}\sum_{j=1}^{N} \left( p_{i,j} - \mu_i \right)^3 \right)^{1/3} \)  (3)
where \( p_{i,j} \) is the value of the j-th pixel in the i-th color channel and N is the total number of pixels in the ROI of each plot (Figure 1). The terms \( \mu_i \) (1 ≤ i ≤ 3) represent the average of each color channel, and \( \sigma_i \) and \( s_i \) represent the standard deviation and skewness of each channel, respectively. In this study, we used the HSV color space, giving the nine color features listed in Table 3.
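As a sketch of Equations (1)–(3), the snippet below computes the nine HSV color moments of a plot ROI, assuming the ROI is available as an RGB pixel array; the conversion via matplotlib’s rgb_to_hsv and the sign-safe cube root for the skewness are implementation choices, not taken from the original code.

```python
import numpy as np
from matplotlib.colors import rgb_to_hsv

def color_moments(rgb_roi):
    """Nine HSV color moments (Equations (1)-(3)) of an ROI.

    rgb_roi: array of shape (..., 3) with RGB pixel values
    (uint8 or floats in [0, 1]) of one plot.
    """
    rgb = np.asarray(rgb_roi, dtype=np.float64)
    if rgb.max() > 1.0:
        rgb = rgb / 255.0                        # scale uint8 to [0, 1]
    hsv = rgb_to_hsv(rgb).reshape(-1, 3)         # N pixels x (H, S, V)
    moments = {}
    for i, ch in enumerate("HSV"):
        p = hsv[:, i]
        mu = p.mean()                            # Equation (1): mean
        sigma = np.sqrt(((p - mu) ** 2).mean())  # Equation (2): spread
        m3 = ((p - mu) ** 3).mean()              # third central moment
        ske = np.sign(m3) * np.abs(m3) ** (1.0 / 3.0)  # Equation (3)
        moments[ch] = mu
        moments[f"{ch}_var"] = sigma
        moments[f"{ch}_ske"] = ske
    return moments
```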

2.4. Algorithms of Multivariate Regression Model

2.4.1. PLSR

PLSR is one of the most common multivariate regression algorithms for data with collinear variables. It extends multiple linear regression (MLR) while avoiding multicollinearity [40]. In this study, we used PLSR to examine the relationships of RGB-VIs and color moments with rice PNC. The number of latent variables is a critical parameter of the PLSR algorithm and was determined following the method described by Fassio and Cozzolino [41]: the number of latent variables was selected to minimize the predicted residual sum of squares (PRESS) [42] under 10-fold cross-validation in the calibration datasets.
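A minimal sketch of this selection procedure is shown below, assuming calibration arrays X_cal (input variables) and y_cal (measured PNC); the names and the search range are illustrative.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import KFold, cross_val_predict

def select_n_components(X_cal, y_cal, max_components=10):
    """Pick the PLSR latent-variable count that minimizes PRESS
    under 10-fold cross-validation."""
    y = np.asarray(y_cal, dtype=float)
    cv = KFold(n_splits=10, shuffle=True, random_state=0)
    press = []
    for n in range(1, max_components + 1):
        y_pred = cross_val_predict(PLSRegression(n_components=n),
                                   X_cal, y, cv=cv)
        press.append(((y - y_pred.ravel()) ** 2).sum())  # PRESS for n LVs
    best = int(np.argmin(press)) + 1
    return best, press
```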

2.4.2. Random Forests

The random forest algorithm was developed by Breiman [43] and shows a promising capability to avoid overfitting by randomly sampling the predictor space. It can capture non-linear relationships without assumptions about variable distributions or dependency [12]. In addition, RF can evaluate the importance of independent variables, partly handles multicollinearity among variables, and is tolerant to noise and outliers. In this study, we implemented RF in the Python 3.6 environment using the “RandomForestRegressor” class of the “scikit-learn” package. Three tuning parameters (max_depth, min_samples_split, min_samples_leaf) were optimized using the “GridSearchCV” module with 10-fold cross-validation in the calibration datasets. Other parameters were kept at their default values.
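The sketch below reproduces this tuning step under stated assumptions: calibration arrays X_cal and y_cal, and the search ranges given in Table 6.

```python
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import GridSearchCV

def tune_rf(X_cal, y_cal):
    """Grid-search the three RF parameters over the Table 6 ranges
    with 10-fold cross-validation and return the tuned model."""
    param_grid = {
        "max_depth": list(range(2, 7)),          # 2-6
        "min_samples_split": list(range(2, 9)),  # 2-8
        "min_samples_leaf": list(range(1, 13)),  # 1-12
    }
    search = GridSearchCV(
        RandomForestRegressor(random_state=0),   # other parameters default
        param_grid, cv=10, scoring="neg_mean_squared_error",
    )
    search.fit(X_cal, y_cal)
    return search.best_estimator_

# rf_best = tune_rf(X_cal, y_cal)
# importances = rf_best.feature_importances_    # basis of Table 7 rankings
```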

2.5. Statistical Analysis

In this study, we used the data from 2019 as the calibration set and the data from 2018 as the validation set (Figure 2). Before establishing the estimation models, the correlations between VIs, color moments, and PNC were analyzed using Pearson’s correlation coefficient.
Paired t-tests were conducted to analyze the differences between the measured and estimated PNC values in the calibration and validation datasets, using the “ttest_rel” function from the “scipy” package in the Python 3.6 environment. Means were compared at the 5% level of significance [44]. Fit was assessed against a 1:1 line of estimated versus measured PNC values. Model performance for PNC estimation was evaluated by comparing the coefficient of determination (R2) in Equation (4) and the normalized root mean square error (NRMSE) in Equation (5). These statistical indicators were expressed as follows:
\( R^2 = 1 - \frac{\sum_{i=1}^{n} \left( P_i - O_i \right)^2}{\sum_{i=1}^{n} \left( P_i - \bar{O} \right)^2} \)  (4)

\( NRMSE = \frac{100}{\bar{O}} \times \sqrt{\frac{1}{n}\sum_{i=1}^{n} \left( P_i - O_i \right)^2} \)  (5)
where n is the number of observations, \( \bar{O} \) denotes the average measured PNC, and \( P_i \) and \( O_i \) are the estimated and observed PNC values, respectively. Generally, a simulation is considered excellent when NRMSE is less than 10%, good when between 10% and 20%, fair when between 20% and 30%, and poor when greater than 30% [45].
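For reference, a minimal sketch of Equations (4) and (5) and the paired t-test, assuming 1-D arrays of measured and estimated PNC values (the NRMSE is returned as a fraction of the measured mean, matching how the results are reported):

```python
import numpy as np
from scipy.stats import ttest_rel

def evaluate_pnc(measured, estimated):
    """R2 (Equation (4)), NRMSE (Equation (5)), and the paired
    t-test p-value between measured and estimated PNC."""
    O = np.asarray(measured, dtype=float)   # observed PNC
    P = np.asarray(estimated, dtype=float)  # estimated PNC
    r2 = 1.0 - ((P - O) ** 2).sum() / ((P - O.mean()) ** 2).sum()
    nrmse = np.sqrt(((P - O) ** 2).mean()) / O.mean()
    _, p_value = ttest_rel(O, P)            # means compared at the 5% level
    return r2, nrmse, p_value
```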

3. Results

3.1. Descriptive Analysis of Measured PNC Data

Across growth stages, years, and applied N rates, rice PNC ranged from 0.7% to 3.5% (Table 4). Because of the N “dilution effect” [46], the average observed PNC decreased from the tillering stage to the flowering stage in both the calibration and validation sets. In the calibration set, the average PNC decreased from 2.3% at the tillering stage and 1.7% at the jointing stage to 1.1% at the flowering stage. In the validation set, the average PNC values were close to those of the calibration set at the same growth stages, decreasing from 2.5% at the tillering stage and 1.8% at the jointing stage to 1.0% at the flowering stage.

3.2. Correlations between RGB-VIs, Color Moments and PNC

To evaluate the performance of the 12 RGB-VIs and 9 color moments obtained from the UAV images, Pearson correlation analysis between PNC and these variables was carried out across the growth stages (Figure 3). At the tillering stage, RGB-VIs such as NRI (r = −0.77) and VARI (r = 0.77) were highly correlated with rice PNC. Except for S_var (p > 0.05), all color moments were significantly correlated with PNC, with absolute r values ranging from 0.58 to 0.83. At the jointing stage, very strong correlations were found between RGB-VIs and PNC, for example G/R (r = 0.89) and NGRDI (r = 0.89). Color moments showed slightly lower correlations with PNC; the top three were V, S, and H, with absolute r values ranging from 0.67 to 0.88. At the flowering stage, G/R had the highest correlation with PNC among the RGB-VIs (r = 0.84), and H had the highest correlation among the color moments (r = 0.77).
Among the VIs and color moments, some variables were consistently well correlated with PNC across the growth stages, such as G/R, NGRDI, INT, ExR, H, and V, with r values ranging from 0.74 to 0.89, 0.74 to 0.89, −0.70 to −0.87, −0.72 to −0.85, 0.67 to 0.77, and −0.64 to −0.88, respectively. However, for other variables, such as NRI, NGI, VARI, and H_ske (Figure 3), the strength or sign of the correlation varied across stages even though each was well correlated with PNC at the three individual stages.

3.3. PLSR Analysis

Figure 4 shows how the mean PRESS values changed with the number of latent components under 10-fold cross-validation in the calibration datasets. As the number of PLSR components increased, the mean PRESS values first decreased, then increased slowly, and finally remained relatively stable, except at the flowering stage. When using all the RGB-VIs as input variables, the PLSR models achieved minimum mean PRESS values with 2, 2, 2, and 3 components at the tillering, jointing, flowering, and combined stages, respectively. When using all the color moments as inputs, the PLSR models attained minimum mean PRESS values with 5, 4, 1, and 3 components at the tillering, jointing, flowering, and combined stages, respectively. When combining all the RGB-VIs and color moments as inputs, the minimum mean PRESS values were achieved with 6, 2, 6, and 6 components at the tillering, jointing, flowering, and combined stages, respectively.
Table 5 shows the results of the PLSR models. In the calibration datasets, the PLSR models using color moments as inputs obtained slightly better results than those using RGB-VIs, with R2 ranging from 0.72 to 0.89 and NRMSE from 0.10 to 0.18 across the single and combined stages. Except at the jointing stage, the PLSR models using all variables performed best, with R2 values of 0.79, 0.80, and 0.83 at the tillering, flowering, and combined stages, respectively. In the validation datasets, the models using all variables showed higher accuracy and lower NRMSE than the models using either RGB-VIs or color moments alone, with R2 ranging from 0.68 to 0.87 and NRMSE from 0.10 to 0.29 across the single and combined stages. In addition, the models using only color moments showed lower R2 values (0.32 and 0.33) than the models using only RGB-VIs (R2 = 0.63 and 0.60) at the tillering and flowering stages, respectively.
Scatterplots of estimated versus measured PNC across single and combined stages are presented in Figure 5; the black line is the 1:1 line used to observe the distribution of the scatter. At the flowering stage, the PLSR models overestimated PNC at high measured values in the validation dataset when using either RGB-VIs or color moments alone (Figure 5c,g). Paired t-tests showed significant differences (p < 0.05) between estimated and measured PNC in the validation datasets at the flowering stage for the models using only RGB-VIs or only color moments under the no-N-stress treatment. However, this overestimation was eliminated (p > 0.05) when all variables were considered (Figure 5k). Overall, the scatter from the models using all variables lay closer to the 1:1 line than that from the models using only RGB-VIs or only color moments.

3.4. RF Analysis

Before estimating PNC with the RF models, grid search with 10-fold cross-validation was used to obtain the optimal parameter sets in the calibration datasets (Table 6). The top five important variables for rice PNC estimation in all RF models are presented in Table 7. For the RF models based only on RGB-VIs, the ranking of important variables was inconsistent across growth stages. At the tillering stage, the differences in importance were relatively small, with values ranging from 0.10 to 0.16. At the flowering and combined stages, the most important variables were ExR and NRI, respectively; their importance values (0.35 and 0.44) indicate a strong influence on PNC estimation. For the RF models based only on color moments, the two most important variables dominated model performance at the jointing, flowering, and combined stages, with combined importance values exceeding 0.79. For the models using all variables as inputs, at least one of the five most important variables belonged to the color moments at both the single and combined stages.
Table 8 displays the calibration and validation statistics of the estimated PNC from the RF models. In the calibration datasets, the RF models using only RGB-VIs performed well, with R2 ranging from 0.79 to 0.93 and NRMSE from 0.10 to 0.19. Using only color moments yielded similar results, with R2 ranging from 0.73 to 0.94 and NRMSE from 0.10 to 0.16 across the single and combined stages. The models combining RGB-VIs and color moments outperformed the models using either input alone, with R2 ranging from 0.83 to 0.95 and NRMSE from 0.08 to 0.15 at the single and combined stages. In the validation datasets, all models obtained slightly lower R2 values than in the calibration datasets; nevertheless, the NRMSE values remained low, ranging from 0.07 to 0.11 across the single stages. The fusion of RGB-VIs and color moments improved PNC estimation, with R2 ranging from 0.69 to 0.91 and NRMSE from 0.07 to 0.13, compared to the models using only RGB-VIs or only color moments. This indicates that the RF models could effectively predict rice PNC.
Figure 6 displays the scatterplots of estimated versus measured PNC across the single and combined stages. In the calibration datasets, the scatter is close to the 1:1 line. However, PNC was slightly overestimated at low PNC levels, as indicated by deviation from the 1:1 line, when only color moments were used at the flowering and combined stages in the validation datasets (Figure 6g,h). A significant difference (p < 0.05) between estimated and measured PNC was found for the RF model using only color moments in the validation dataset at the flowering stage under different N conditions. When all variables were used as inputs, the scatter from the RF models lay close to the 1:1 line in both the calibration and validation datasets (Figure 6i–l).

4. Discussion

4.1. Comparisons of the PLSR and RF Models

In this study, the PLSR and RF models were tested for assessing rice PNC under varying N fertilizer application rates at different growth stages. Both are multivariate regression methods well suited to handling many cross-correlated predictors [25,47]. Based on the results in Table 5 and Table 8, the RF models outperformed the PLSR models for PNC estimation on the same validation dataset. In similar studies, Maimaitijiang et al. [48] compared PLSR, RF, extreme learning regression (ELR), and support vector regression (SVR) for estimating crop LNC, using satellite-based VIs and UAV-based canopy structure information as inputs; the RF model performed best for N estimation. Osco et al. [49] evaluated nine machine learning models for maize LNC prediction using UAV-based multispectral imagery and found that the RF model performed better than the others. Liang et al. [50] also concluded that the RF model was preferable to the least squares support vector regression model (LS-SVR) for predicting LNC. Our results are consistent with these studies. Accordingly, the RF model, based on decision tree ensembles, is robust and appropriate for assessing crop N status compared to other machine learning algorithms.

4.2. Fusion of RGB-VIs and Color Moments for PNC Estimation

Many previous studies have proposed new VIs to improve crop N status estimation [7,51]. However, other studies concluded that these VIs performed poorly and unstably in N status estimation [4,52,53]. In recent years, texture indices have been successfully used to monitor crop growth status for precision agriculture [4,25]. Compared with texture or shape information, color moments are easier to extract [39]. In addition, color moments, which are based on color distribution features, can be matched more efficiently and robustly than color histograms [27]. In this study, color moments were investigated for crop N estimation for the first time. Overall, combining color feature information (color moments) and RGB-VIs derived from UAV imagery improved the performance of the PLSR and RF models for PNC estimation at both single and combined stages (Table 5 and Table 8).
In the PLSR models, the number of latent components was the only tuning parameter. The PLSR models using VIs or color moments alone tended to reach minimum mean PRESS values with between two and five components (Figure 4a,b). When combining VIs and color moments as inputs, the optimal number of components was approximately six at the individual and combined stages (Figure 4c). This indicates that, with multiple types of input variables, the PLSR models could extract more informative components and thereby improve accuracy compared to models with a single type of input. Among the top five feature importance values in the RF models, at least one variable came from the color moments at the single and combined stages (Table 7). Additionally, the three moments of H contributed considerably to RF model performance at the tillering stage when all variables were used as inputs (Table 7). Thus, adding color moments has the potential to improve the accuracy of crop N status estimation. Although the fusion of color moments and RGB-VIs reduced the NRMSE values to a certain extent, the NRMSE of the RF and PLSR models still reached 0.15 to 0.17 and 0.13 to 0.29, respectively, at the combined stages in the calibration and validation datasets. This was partly because several input variables (NRI, NGI, and VARI) changed the strength or sign of their correlation with PNC across phenological stages, even though each was well correlated with PNC at the individual stages (Figure 3). Including these variables at the combined stages therefore introduced larger uncertainty than in the single-stage models.

4.3. Implications of UAV-Based RGB Imagery for Crop Monitoring

In recent decades, various sensors mounted on UAVs, such as LiDAR [54], hyperspectral [55] and multispectral [26] devices, NIR cameras [56,57,58], and RGB cameras [59,60], have been widely used to monitor crop growth traits. Compared to the other sensors, a UAV RGB camera is cheaper and lighter, which permits longer working times and application over larger areas in precision agriculture. With their ultra-high ground resolution, UAV RGB images can be used for precise crop classification and estimation of crop biochemical parameters. In addition, several previous studies demonstrated that digital surface models (DSMs) derived from UAV RGB imagery are efficient for extracting vegetation canopy structure [61,62,63]. Thus, ultra-high-resolution UAV images are powerful for field-scale applications in precision agriculture. In this study, we fused color moments and VIs from UAV RGB imagery to estimate rice PNC across cultivars and growth stages, and model performance was comparable to previous studies [3,23].
Although the RF models fusing color moments and RGB-VIs performed better than the PLSR models, other promising methods, such as ensemble models [64], Long Short-Term Memory (LSTM) networks [65,66], and convolutional neural networks (CNNs) [67,68], should be explored to further improve the accuracy of crop N estimation. Furthermore, imagery acquired over multiple years at more locations with different soil characteristics and climate conditions is essential for practical application.

5. Conclusions

Incorporating color moments into both the PLSR and RF models yielded more accurate PNC estimates in rice, reducing NRMSE by 9% to 58% compared to models based on RGB-VIs alone across single and combined stages. The RF models with VIs and color moments performed similarly to the PLSR models for PNC estimation. The accuracies of RF models based only on color moments (R2 = 0.57–0.75 and NRMSE = 0.08–0.11) were comparable to those of RF models using only VIs (R2 = 0.62–0.82 and NRMSE = 0.08–0.09) at single stages in the validation datasets. The analysis of important variables in the RF models showed that color moments played an important role in PNC estimation, demonstrating that color moments significantly improved the estimation of rice PNC. Future work should combine texture, color features, structure from motion (SfM), and VIs to further improve model accuracy for monitoring crop growth status; these features are easily and readily extracted from UAV images.

Author Contributions

Conceptualization, H.G.; methodology, H.G.; investigation, C.D., H.X., H.G., F.M., Z.L., Z.Q. and Z.T.; resources, C.D.; data curation, H.G.; writing—original draft preparation, H.G.; writing—review and editing, C.D.; visualization, H.G.; supervision, C.D. and H.X. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Key R&D Plan of Shandong Province (2019JZZY010713), the China-Europe Cooperation Project (Grant No. 2018YFE01070008ASP462), and the “STS” Project of the Chinese Academy of Sciences (KFJ-STS-QYZX-047).

Data Availability Statement

Data sharing not applicable.

Acknowledgments

Thanks to Ke Wu for providing the detailed information on the experimental design in this study.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Cantrell, R.P. The Rice Genome: The Cereal of the World’s Poor Takes Center Stage. Science 2002, 296, 53.
2. Huang, S.; Miao, Y.; Zhao, G.; Yuan, F.; Ma, X.; Tan, C.; Yu, W.; Gnyp, M.L.; Lenz-Wiedemann, V.I.; Rascher, U.; et al. Satellite Remote Sensing-Based In-Season Diagnosis of Rice Nitrogen Status in Northeast China. Remote Sens. 2015, 7, 10646–10667.
3. Zheng, H.; Cheng, T.; Li, D.; Yao, X.; Tian, Y.; Cao, W.; Zhu, Y. Combining Unmanned Aerial Vehicle (UAV)-Based Multispectral Imagery and Ground-Based Hyperspectral Data for Plant Nitrogen Concentration Estimation in Rice. Front. Plant Sci. 2018, 9.
4. Zheng, H.; Ma, J.; Zhou, M.; Li, D.; Yao, X.; Cao, W.; Zhu, Y.; Cheng, T. Enhancing the Nitrogen Signals of Rice Canopies across Critical Growth Stages through the Integration of Textural and Spectral Information from Unmanned Aerial Vehicle (UAV) Multispectral Imagery. Remote Sens. 2020, 12, 957.
5. Padilla, F.M.; Peña-Fleitas, M.T.; Gallardo, M.; Thompson, R.B. Evaluation of optical sensor measurements of canopy reflectance and of leaf flavonols and chlorophyll contents to assess crop nitrogen status of muskmelon. Eur. J. Agron. 2014, 58, 39–52.
6. Miphokasap, P.; Honda, K.; Vaiphasa, C.; Souris, M.; Nagai, M. Estimating Canopy Nitrogen Concentration in Sugarcane Using Field Imaging Spectroscopy. Remote Sens. 2012, 4, 1651–1670.
7. Tian, Y.; Yao, X.; Yang, J.; Cao, W.; Hannaway, D.; Zhu, Y. Assessing newly developed and published vegetation indices for estimating rice leaf nitrogen concentration with ground- and space-based hyperspectral reflectance. Field Crop. Res. 2011, 120, 299–310.
8. Wang, W.; Yao, X.; Yao, X.; Tian, Y.; Liu, X.; Ni, J.; Cao, W.; Zhu, Y. Estimating leaf nitrogen concentration with three-band vegetation indices in rice and wheat. Field Crop. Res. 2012, 129, 90–98.
9. Kalacska, M.; Lalonde, M.; Moore, T. Estimation of foliar chlorophyll and nitrogen content in an ombrotrophic bog from hyperspectral data: Scaling from leaf to image. Remote Sens. Environ. 2015, 169, 270–279.
10. Lepine, L.C.; Ollinger, S.V.; Ouimette, A.P.; Martin, M.E. Examining spectral reflectance features related to foliar nitrogen in forests: Implications for broad-scale nitrogen mapping. Remote Sens. Environ. 2016, 173, 174–186.
11. Crema, A.; Boschetti, M.; Nutini, F.; Cillis, D.; Casa, R. Influence of Soil Properties on Maize and Wheat Nitrogen Status Assessment from Sentinel-2 Data. Remote Sens. 2020, 12, 2175.
12. Loozen, Y.; Rebel, K.T.; de Jong, S.M.; Lu, M.; Ollinger, S.V.; Wassen, M.J.; Karssenberg, D. Mapping canopy nitrogen in European forests using remote sensing and environmental variables with the random forests method. Remote Sens. Environ. 2020, 247, 111933.
13. Li, Y.; Chen, D.; Walker, C.; Angus, J. Estimating the nitrogen status of crops using a digital camera. Field Crop. Res. 2010, 118, 221–227.
14. Bendig, J.; Yu, K.; Aasen, H.; Bolten, A.; Bennertz, S.; Broscheit, J.; Gnyp, M.L.; Bareth, G. Combining UAV-based plant height from crop surface models, visible, and near infrared vegetation indices for biomass monitoring in barley. Int. J. Appl. Earth Obs. Geoinf. 2015, 39, 79–87.
15. Yang, G.; Liu, J.; Zhao, C.; Li, Z.; Huang, Y.; Yu, H.; Xu, B.; Yang, X.; Zhu, D.; Zhang, X.; et al. Unmanned Aerial Vehicle Remote Sensing for Field-Based Crop Phenotyping: Current Status and Perspectives. Front. Plant Sci. 2017, 8, 1111.
16. Sidike, P.; Sagan, V.; Qumsiyeh, M.; Maimaitijiang, M.; Essa, A.; Asari, V. Adaptive Trigonometric Transformation Function with Image Contrast and Color Enhancement: Application to Unmanned Aerial System Imagery. IEEE Geosci. Remote Sens. Lett. 2018, 15, 404–408.
17. Wang, Y.; Wang, D.; Zhang, G.; Wang, J. Estimating nitrogen status of rice using the image segmentation of G-R thresholding method. Field Crop. Res. 2013, 149, 33–39.
18. Babel, M.; Agarwal, A.; Swain, D.; Herath, S. Evaluation of climate change impacts and adaptation measures for rice cultivation in Northeast Thailand. Clim. Res. 2011, 46, 137–146.
19. Fanourakis, D.; Briese, C.; Max, J.F.; Kleinen, S.; Putz, A.; Fiorani, F.; Ulbrich, A.; Schurr, U. Rapid determination of leaf area and plant height by using light curtain arrays in four species with contrasting shoot architecture. Plant Methods 2014, 10, 9.
20. Golzarian, M.R.; Frick, R.A.; Rajendran, K.; Berger, B.; Roy, S.; Tester, M.; Lun, D.S. Accurate inference of shoot biomass from high-throughput images of cereal plants. Plant Methods 2011, 7, 2.
21. Bendig, J.; Bolten, A.; Bennertz, S.; Broscheit, J.; Eichfuss, S.; Bareth, G. Estimating Biomass of Barley Using Crop Surface Models (CSMs) Derived from UAV-Based RGB Imaging. Remote Sens. 2014, 6, 10395–10412.
22. Jiang, J.; Cai, W.; Zheng, H.; Cheng, T.; Tian, Y.; Zhu, Y.; Ehsani, R.; Hu, Y.; Niu, Q.; Gui, L.; et al. Using Digital Cameras on an Unmanned Aerial Vehicle to Derive Optimum Color Vegetation Indices for Leaf Nitrogen Concentration Monitoring in Winter Wheat. Remote Sens. 2019, 11, 2667.
23. Fu, Y.; Yang, G.; Li, Z.; Song, X.; Li, Z.; Xu, X.; Wang, P.; Zhao, C. Winter Wheat Nitrogen Status Estimation Using UAV-Based RGB Imagery and Gaussian Processes Regression. Remote Sens. 2020, 12, 3778.
24. Rorie, R.L.; Purcell, L.C.; Mozaffari, M.; Karcher, D.E.; King, C.A.; Marsh, M.C.; Longer, D.E. Association of “Greenness” in Corn with Yield and Leaf Nitrogen Concentration. Agron. J. 2011, 103, 529–535.
25. Li, S.; Yuan, F.; Ata-Ui-Karim, S.T.; Zheng, H.; Cheng, T.; Liu, X.; Tian, Y.; Zhu, Y.; Cao, W.; Cao, Q. Combining Color Indices and Textures of UAV-Based Digital Imagery for Rice LAI Estimation. Remote Sens. 2019, 11, 1763.
26. Yue, J.; Yang, G.; Tian, Q.; Feng, H.; Xu, K.; Zhou, C. Estimate of winter-wheat above-ground biomass based on UAV ultrahigh-ground-resolution image textures and vegetation indices. ISPRS J. Photogramm. Remote Sens. 2019, 150, 226–244.
27. Stricker, M.A.; Orengo, M. Similarity of color images. In Proceedings of the Storage and Retrieval for Image and Video Databases III—International Society Optical Engineering, San Jose, CA, USA, 23 March 1995; pp. 381–392.
28. Li, W.; Niu, Z.; Chen, H.; Li, D.; Wu, M.; Zhao, W. Remote estimation of canopy height and aboveground biomass of maize using high-resolution stereo images from a low-cost unmanned aerial vehicle system. Ecol. Indic. 2016, 67, 637–648.
29. Page, A.L.; Miller, R.H.; Keeney, D.R. Methods of Soil Analysis: Chemical and Microbiological Properties; American Society of Agronomy: Madison, WI, USA, 1982.
30. Yu, N.; Li, L.; Schmitz, N.; Tian, L.F.; Greenberg, J.A.; Diers, B.W. Development of methods to improve soybean yield estimation and predict plant maturity with an unmanned aerial vehicle-based platform. Remote Sens. Environ. 2016, 187, 91–101.
31. Qiu, Z.; Xiang, H.; Ma, F.; Du, C. Qualifications of Rice Growth Indicators Optimized at Different Growth Stages Using Unmanned Aerial Vehicle Digital Imagery. Remote Sens. 2020, 12, 3228.
32. Saberioon, M.; Amin, M.; Anuar, A.; Gholizadeh, A.; Wayayok, A.; Khairunniza-Bejo, S. Assessment of rice leaf chlorophyll content using visible bands at different growth stages at both the leaf and canopy scale. Int. J. Appl. Earth Obs. Geoinf. 2014, 32, 35–45.
33. Liu, K.; Li, Y.; Han, T.; Yu, X.; Ye, H.; Hu, H.; Hu, Z. Evaluation of grain yield based on digital images of rice canopy. Plant Methods 2019, 15, 28.
34. Maimaitijiang, M.; Sagan, V.; Sidike, P.; Maimaitiyiming, M.; Hartling, S.; Peterson, K.T.; Maw, M.J.; Shakoor, N.; Mockler, T.; Fritschi, F.B. Vegetation Index Weighted Canopy Volume Model (CVMVI) for soybean biomass estimation from Unmanned Aerial System-based RGB imagery. ISPRS J. Photogramm. Remote Sens. 2019, 151, 27–41.
35. Meyer, G.E.; Neto, J.C. Verification of color vegetation indices for automated crop imaging applications. Comput. Electron. Agric. 2008, 63, 282–293.
36. Woebbecke, D.M.; Meyer, G.E.; Vonbargen, K.; Mortensen, D.A. Color Indexes for Weed Identification under Various Soil, Residue, and Lighting Conditions. Trans. ASAE 1995, 38, 259–269.
37. Ahmad, I.S.; Reid, J.F. Evaluation of Colour Representations for Maize Images. J. Agric. Eng. Res. 1996, 63, 185–195.
38. Sakamoto, T.; Gitelson, A.A.; Wardlow, B.D.; Arkebauer, T.J.; Verma, S.B.; Suyker, A.E.; Shibayama, M. Application of day and night digital photographs for estimating maize biophysical characteristics. Precis. Agric. 2012, 13, 285–301.
39. Stricker, M.A.; Dimai, A. Color Indexing with Weak Spatial Constraints. In Proceedings of the IS&T/SPIE’s Symposium on Electronic Imaging: Science and Technology, San Jose, CA, USA, 13 March 1996; pp. 29–40.
40. Grossman, Y.; Ustin, S.; Jacquemoud, S.; Sanderson, E.; Schmuck, G.; Verdebout, J. Critique of stepwise multiple linear regression for the extraction of leaf biochemistry information from leaf reflectance data. Remote Sens. Environ. 1996, 56, 182–193.
41. Fassio, A.; Cozzolino, D. Non-destructive prediction of chemical composition in sunflower seeds by near infrared spectroscopy. Ind. Crop. Prod. 2004, 20, 321–329.
42. Nguyen, H.T.; Lee, B.-W. Assessment of rice leaf growth and nitrogen status by hyperspectral canopy reflectance and partial least square regression. Eur. J. Agron. 2006, 24, 349–356.
43. Breiman, L. Random Forests. Mach. Learn. 2001, 45, 5–32.
44. Enciso, J.; Avila, C.A.; Jung, J.; Elsayed-Farag, S.; Chang, A.; Yeom, J.; Landivar, J.; Maeda, M.; Chavez, J.C. Validation of agronomic UAV and field measurements for tomato varieties. Comput. Electron. Agric. 2019, 158, 278–283.
45. Loague, K. The impact of land use on estimates of pesticide leaching potential: Assessments and uncertainties. J. Contam. Hydrol. 1991, 8, 157–175.
46. Plénet, D.; Lemaire, G. Relationships between dynamics of nitrogen uptake and dry matter accumulation in maize crops. Determination of critical N concentration. Plant Soil 1999, 216, 65–82.
47. Cutler, D.R.; Edwards, T.C., Jr.; Beard, K.H.; Cutler, A.; Hess, K.T.; Gibson, J.; Lawler, J.J. Random Forests for Classification in Ecology. Ecology 2007, 88, 2783–2792.
48. Maimaitijiang, M.; Sagan, V.; Sidike, P.; Daloye, A.M.; Erkbol, H.; Fritschi, F.B. Crop Monitoring Using Satellite/UAV Data Fusion and Machine Learning. Remote Sens. 2020, 12, 1357.
49. Osco, L.; Junior, J.; Ramos, A.; Furuya, D.; Santana, D.; Teodoro, L.; Gonçalves, W.; Baio, F.; Pistori, H.; Junior, C.; et al. Leaf Nitrogen Concentration and Plant Height Prediction for Maize Using UAV-Based Multispectral Imagery and Machine Learning Techniques. Remote Sens. 2020, 12, 3237.
50. Liang, L.; Di, L.; Huang, T.; Wang, J.; Lin, L.; Wang, L.; Yang, M. Estimation of Leaf Nitrogen Content in Wheat Using New Hyperspectral Indices and a Random Forest Regression Algorithm. Remote Sens. 2018, 10, 1940.
51. Stroppiana, D.; Boschetti, M.; Brivio, P.A.; Bocchi, S. Plant nitrogen concentration in paddy rice from field canopy hyperspectral radiometry. Field Crop. Res. 2009, 111, 119–129.
52. Prey, L.; Schmidhalter, U. Sensitivity of Vegetation Indices for Estimating Vegetative N Status in Winter Wheat. Sensors 2019, 19, 3712.
53. Yu, K.; Li, F.; Gnyp, M.L.; Miao, Y.; Bareth, G.; Chen, X. Remotely detecting canopy nitrogen concentration and uptake of paddy rice in the Northeast China Plain. ISPRS J. Photogramm. Remote Sens. 2013, 78, 102–115.
54. Maesano, M.; Khoury, S.; Nakhle, F.; Firrincieli, A.; Gay, A.; Tauro, F.; Harfouche, A. UAV-Based LiDAR for High-Throughput Determination of Plant Height and Above-Ground Biomass of the Bioenergy Grass Arundo donax. Remote Sens. 2020, 12, 3464.
55. Bareth, G.; Aasen, H.; Bendig, J.; Gnyp, M.L.; Bolten, A.; Jung, A.; Michels, R.; Soukkamaki, J. Low-weight and UAV-based Hyperspectral Full-frame Cameras for Monitoring Crops: Spectral Comparison with Portable Spectroradiometer Measurements. Photogramm. Fernerkund. Geoinf. 2015, 1, 69–79.
56. Kopačková-Strnadová, V.; Koucká, L.; Jelének, J.; Lhotáková, Z.; Oulehle, F. Canopy Top, Height and Photosynthetic Pigment Estimation Using Parrot Sequoia Multispectral Imagery and the Unmanned Aerial Vehicle (UAV). Remote Sens. 2021, 13, 705.
57. Poley, L.G.; McDermid, G.J. A Systematic Review of the Factors Influencing the Estimation of Vegetation Aboveground Biomass Using Unmanned Aerial Systems. Remote Sens. 2020, 12, 1052.
58. Zhang, J.; Wang, C.; Yang, C.; Jiang, Z.; Zhou, G.; Wang, B.; Shi, Y.; Zhang, D.; You, L.; Xie, J. Evaluation of a UAV-mounted consumer grade camera with different spectral modifications and two handheld spectral sensors for rapeseed growth monitoring: Performance and influencing factors. Precis. Agric. 2020, 21, 1092–1120.
59. Hasan, U.; Sawut, M.; Chen, S. Estimating the Leaf Area Index of Winter Wheat Based on Unmanned Aerial Vehicle RGB-Image Parameters. Sustainability 2019, 11, 6829.
60. Niu, Y.; Zhang, L.; Zhang, H.; Han, W.; Peng, X. Estimating Above-Ground Biomass of Maize Using Features Derived from UAV-Based RGB Imagery. Remote Sens. 2019, 11, 1261.
61. De Castro, A.I.; Torres-Sánchez, J.; Peña, J.M.; Jiménez-Brenes, F.M.; Csillik, O.; López-Granados, F. An Automatic Random Forest-OBIA Algorithm for Early Weed Mapping between and within Crop Rows Using UAV Imagery. Remote Sens. 2018, 10, 285.
62. Fu, Y.; Yang, G.; Song, X.; Li, Z.; Xu, X.; Feng, H.; Zhao, C. Improved Estimation of Winter Wheat Aboveground Biomass Using Multiscale Textures Extracted from UAV-Based Digital Images and Hyperspectral Feature Analysis. Remote Sens. 2021, 13, 581.
63. Modica, G.; Messina, G.; De Luca, G.; Fiozzo, V.; Praticò, S. Monitoring the vegetation vigor in heterogeneous citrus and olive orchards. A multiscale object-based approach to extract trees’ crowns from UAV multispectral imagery. Comput. Electron. Agric. 2020, 175, 105500.
64. Shahhosseini, M.; Hu, G.; Archontoulis, S.V. Forecasting Corn Yield with Machine Learning Ensembles. Front. Plant Sci. 2020, 11, 1120.
65. Cao, J.; Zhang, Z.; Tao, F.; Zhang, L.; Luo, Y.; Zhang, J.; Han, J.; Xie, J. Integrating Multi-Source Data for Rice Yield Prediction across China using Machine Learning and Deep Learning Approaches. Agric. For. Meteorol. 2021, 297, 108275.
66. Cao, J.; Zhang, Z.; Luo, Y.; Zhang, L.; Zhang, J.; Li, Z.; Tao, F. Wheat yield predictions at a county and field scale with deep learning, machine learning, and Google Earth Engine. Eur. J. Agron. 2021, 123, 126204.
67. Yang, Q.; Shi, L.; Han, J.; Zha, Y.; Zhu, P. Deep convolutional neural networks for rice grain yield estimation at the ripening stage using UAV-based remotely sensed images. Field Crop. Res. 2019, 235, 142–153.
68. Yang, Q.; Shi, L.; Han, J.; Yu, J.; Huang, K. A near real-time deep learning approach for detecting rice phenology based on UAV images. Agric. For. Meteorol. 2020, 287, 107938.
Figure 1. Experiment site location (Nanjing city, Jiangsu Province, China) and investigated phenological stages in the study area.
Figure 2. Flowchart of the experiment methodology. The acronyms “Cali” and “Vali” refer to calibration and validation, respectively.
Figure 3. Correlation analysis between RGB-VIs, color moments and PNC across growth stages: (a–c) represent the correlation between PNC and RGB-VIs at the tillering, jointing, and flowering stages, respectively; (d–f) represent the correlation between PNC and color moments at the tillering, jointing, and flowering stages, respectively.
Figure 4. Mean PRESS values over the number of latent components for PLSR with 10-fold cross validation: (a) RGB-VIs only, (b) color moments only, (c) all variables.
Figure 5. Scatterplots between measured and estimated PNC values (%N) based on PLSR models in the calibration and validation datasets across single and combined stages. The embedded histogram in each subplot shows the mean ± standard error for the different N treatments in the calibration and validation datasets. Note: “N0” represents N0 in Table 1; “Full N” represents the no-N-stress treatment, combining N treatments N1 to N4 in Table 1.
Figure 6. Scatterplots between measured and estimated PNC values (%N) based on RF models in the calibration and validation datasets across the growth stages. The embedded histogram in each subplot shows the mean ± standard error for the different N treatments in the calibration and validation datasets. Note: “N0” represents N0 in Table 1; “Full N” represents the no-N-stress treatment, combining N treatments N1 to N4 in Table 1.
Table 1. Rice cultivars and fertilizer rates in the experiment.

| Treatment | Cultivar | N (kg/ha) | P2O5 (kg/ha) | K2O (kg/ha) |
|---|---|---|---|---|
| N0 | Wuyunjing 23 (2018) / Nanjing 5055 (2019) | 0 | 0 | 0 |
| N1 | | 240 | 60 | 120 |
| N2 | | 240 (30%) 1 | 60 | 120 |
| N3 | | 240 (40%) 2 | 60 | 120 |
| N4 | | 240 (50%) 3 | 60 | 120 |

1,2,3 The numbers inside the brackets represent the percentage of N from coated urea in the total N rate (urea plus coated urea). The release duration of the coated urea was 3 months; it was provided by the Jiangsu ISSAS fertilizer company (Yizhen, China).
Table 2. Sampling information of the experiment site.

| Year | UAV Flight Date | Sampling Date | Growth Stage |
|---|---|---|---|
| 2018 | 19 July | 19 July | Tillering |
| 2018 | 11 August | 11 August | Jointing |
| 2018 | 9 September | 9 September | Flowering |
| 2019 | 14 July | 14 July | Tillering |
| 2019 | 12 August | 12 August | Jointing |
| 2019 | 8 September | 8 September | Flowering |
Table 4. Descriptive statistics of PNC (%N) measurements.

| Dataset | Stage | Min | Max | Mean | SD |
|---|---|---|---|---|---|
| Calibration (2019) | Tillering | 1.4 | 3.5 | 2.3 | 0.6 |
| | Jointing | 0.9 | 2.6 | 1.7 | 0.5 |
| | Flowering | 0.7 | 1.4 | 1.1 | 0.2 |
| Validation (2018) | Tillering | 2.0 | 3.1 | 2.5 | 0.3 |
| | Jointing | 1.1 | 2.4 | 1.8 | 0.3 |
| | Flowering | 0.8 | 1.2 | 1.0 | 0.1 |
Table 5. Results of the PLSR models for rice PNC (%N) estimation including only RGB-VIs, only color moments, and all variables in calibration and validation datasets.

| Stages | Variables | Calibration R2 | Calibration NRMSE | Validation R2 | Validation NRMSE |
|---|---|---|---|---|---|
| Tillering | RGB-VIs | 0.62 | 0.16 | 0.63 | 0.11 |
| | Color moments | 0.72 | 0.14 | 0.32 | 0.12 |
| | All variables | 0.79 | 0.12 | 0.68 | 0.10 |
| Jointing | RGB-VIs | 0.80 | 0.13 | 0.81 | 0.29 |
| | Color moments | 0.89 | 0.10 | 0.80 | 0.28 |
| | All variables | 0.84 | 0.12 | 0.75 | 0.24 |
| Flowering | RGB-VIs | 0.71 | 0.11 | 0.60 | 0.36 |
| | Color moments | 0.75 | 0.10 | 0.33 | 1.22 |
| | All variables | 0.77 | 0.10 | 0.73 | 0.15 |
| Combined stages | RGB-VIs | 0.80 | 0.19 | 0.84 | 0.30 |
| | Color moments | 0.81 | 0.18 | 0.50 | 0.41 |
| | All variables | 0.83 | 0.17 | 0.87 | 0.29 |
Table 6. Description of tuning parameters in the RF models with 10-fold cross validation in calibration datasets.

| Parameter | Description | Range | Stage | RGB-VIs Only | Color Moments Only | All Variables |
|---|---|---|---|---|---|---|
| max_depth | The maximum depth of the tree | 2–6 | Tillering | 2 | 3 | 6 |
| | | | Jointing | 5 | 6 | 2 |
| | | | Flowering | 6 | 2 | 2 |
| | | | All | 6 | 6 | 4 |
| min_samples_split | The minimum number of samples required to split an internal node | 2–8 | Tillering | 2 | 4 | 2 |
| | | | Jointing | 4 | 2 | 2 |
| | | | Flowering | 4 | 4 | 4 |
| | | | All | 2 | 2 | 2 |
| min_samples_leaf | The minimum number of samples required to be at a leaf node | 1–12 | Tillering | 4 | 8 | 4 |
| | | | Jointing | 4 | 2 | 6 |
| | | | Flowering | 4 | 2 | 8 |
| | | | All | 6 | 4 | 2 |
Table 7. Top 5 important variables and corresponding importance values for PNC estimation for all RF models across single and combined stages: RGB-VIs only, color moments only, and all variables (RGB-VIs + color moments). Each cell gives the variable and its importance value.

| Model | Tillering | Jointing | Flowering | Combined Stages |
|---|---|---|---|---|
| RGB-VIs only | NRI (0.160) | INT (0.209) | ExR (0.353) | NRI (0.439) |
| | GMR (0.127) | ExR (0.177) | NGRDI (0.182) | VARI (0.146) |
| | R/B (0.126) | NGI (0.145) | G/R (0.113) | G/R (0.104) |
| | INT (0.120) | NRI (0.136) | NGI (0.108) | NGRDI (0.097) |
| | NGRDI (0.104) | VARI (0.117) | INT (0.092) | G/B (0.067) |
| Color moments only | H (0.269) | V (0.632) | H (0.706) | V (0.562) |
| | V (0.192) | S (0.183) | H_var (0.086) | H (0.315) |
| | H_ske (0.174) | H (0.139) | S_ske (0.081) | S_var (0.025) |
| | H_var (0.113) | S_ske (0.025) | V (0.032) | S_ske (0.022) |
| | S_ske (0.103) | V_var (0.005) | H_ske (0.032) | H_var (0.019) |
| All variables | H (0.125) | ExR (0.231) | ExR (0.306) | NRI (0.420) |
| | R/B (0.093) | G/R (0.136) | NGRDI (0.166) | G/R (0.125) |
| | H_ske (0.088) | NRI (0.126) | G/R (0.160) | VARI (0.095) |
| | NRI (0.085) | VARI (0.099) | H_var (0.101) | NGRDI (0.066) |
| | H_var (0.080) | V (0.088) | NGI (0.091) | V (0.062) |
Table 8. RF models for rice PNC (%N) estimation including only RGB-VIs, only color moments, and all variables in calibration and validation datasets.

| Stages | Variables | Calibration R2 | Calibration NRMSE | Validation R2 | Validation NRMSE |
|---|---|---|---|---|---|
| Tillering | RGB-VIs | 0.86 | 0.11 | 0.62 | 0.08 |
| | Color moments | 0.73 | 0.15 | 0.57 | 0.08 |
| | All variables | 0.89 | 0.10 | 0.69 | 0.07 |
| Jointing | RGB-VIs | 0.88 | 0.10 | 0.82 | 0.09 |
| | Color moments | 0.90 | 0.10 | 0.75 | 0.10 |
| | All variables | 0.93 | 0.08 | 0.84 | 0.08 |
| Flowering | RGB-VIs | 0.79 | 0.10 | 0.63 | 0.09 |
| | Color moments | 0.73 | 0.11 | 0.59 | 0.11 |
| | All variables | 0.83 | 0.09 | 0.71 | 0.08 |
| Combined stages | RGB-VIs | 0.93 | 0.19 | 0.89 | 0.15 |
| | Color moments | 0.94 | 0.16 | 0.54 | 1.46 |
| | All variables | 0.95 | 0.15 | 0.91 | 0.13 |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.


