Article

A Comparison of Crop Parameters Estimation Using Images from UAV-Mounted Snapshot Hyperspectral Sensor and High-Definition Digital Camera

1 Key Laboratory of Quantitative Remote Sensing in Agriculture of Ministry of Agriculture China, Beijing Research Center for Information Technology in Agriculture, Beijing 100097, China
2 International Institute for Earth System Science, Nanjing University, Nanjing 210023, China
3 Environnement Méditerranéen et Modélisation des Agro-Hydrosystèmes, l’Institut National de Recherche Agronomique, 84914 Avignon, France
4 Institute of Geographical Sciences and Natural Resources Research, Chinese Academy of Sciences, Beijing 100101, China
* Author to whom correspondence should be addressed.
These authors contributed equally to this work.
Submission received: 30 May 2018 / Revised: 9 July 2018 / Accepted: 9 July 2018 / Published: 18 July 2018

Abstract

Timely and accurate estimates of crop parameters are crucial for agricultural management. Unmanned aerial vehicles (UAVs) carrying sophisticated cameras are well suited to this task because they can obtain remote-sensing images with higher temporal, spatial, and ground resolution than satellites. In this study, we evaluated (i) the performance of crop parameter estimates obtained using near-surface spectroscopy (350~2500 nm; resolution 3 nm at 700 nm, 8.5 nm at 1400 nm, 6.5 nm at 2100 nm), a UAV-mounted snapshot hyperspectral sensor (450~950 nm; 8 nm at 532 nm), and a high-definition digital camera (visible R, G, B); and (ii) crop surface models (CSMs), RGB-based vegetation indices (VIs), hyperspectral-based VIs, and methods that combine them for making multi-temporal estimates of crop parameters and for mapping those parameters. The leaf area index (LAI) and above-ground biomass (AGB) are estimated by using linear and exponential equations, random forest (RF) regression, and partial least squares regression (PLSR) to combine the UAV-based spectral VIs and crop heights (from the CSMs). The results show that: (i) spectral VIs correlate strongly with LAI and AGB over single growing stages, whereas crop height correlates positively with AGB over multiple growth stages; (ii) the correlation between AGB and VIs multiplied by crop height is greater than that between AGB and a single VI or crop height alone; (iii) AGB estimates from the UAV-mounted snapshot hyperspectral sensor and the high-definition digital camera are similar to those from the ground spectrometer when the combined methods are used (i.e., VIs multiplied by crop height, or RF and PLSR applied to VIs and crop heights); and (iv) the spectral performance of the sensors is crucial for LAI estimation (wheat LAI cannot be accurately estimated over multiple growing stages using only crop height). The LAI estimates ranked from best to worst are ground spectrometer, UAV snapshot hyperspectral sensor, and UAV high-definition digital camera.


1. Introduction

Crop parameters, such as leaf area index (LAI) and above-ground biomass (AGB), are crucial for accurately monitoring crop growth for agricultural management [1,2], and accurate estimates of crop variables can help improve crop monitoring and yield predictions [3,4]. Traditional methods to estimate crop parameters are based on destructive measurements, which are not only time-consuming and labor-intensive but, more importantly, also difficult to apply over large areas. Remote-sensing data acquired from various platforms (ground, airborne, and satellite) can capture crop-canopy spectra and thereby provide information on the biochemical composition of the crop canopy [5,6,7,8]. Remote-sensing technology is now used to estimate crop parameters such as LAI at different spatial resolutions [6], AGB [5,7], chlorophyll (Chl) [9,10], and the fraction of absorbed photosynthetically active radiation (fAPAR) [11,12].
Timely and accurate estimates of crop parameters are crucial for agriculture management. To achieve this goal, remote sensing via unmanned aerial vehicles (UAVs) has recently attracted the attention of many researchers because it yields remote-sensing images with higher temporal, spatial, and ground resolutions than are available from satellites [13,14,15,16]. Unmanned aerial vehicle remote sensing has numerous advantages compared with remote sensing from conventional aerial vehicles, including cost, weight, flight speed, and flight altitude. More importantly, however, UAVs have simpler requirements for takeoff and landing.
Crop-canopy spectral reflectance is determined by the parameters of the internal structure of leaves (i.e., leaf mesophyll, chlorophyll a and b concentrations, water content, dry matter content, mass per unit area, brown pigments, and total carotenoid content) and of the canopy structure (i.e., LAI, leaf-inclination distribution function, sun and sensor zenith angles, and soil background) [17,18,19,20]. Therefore, crop parameters can be estimated from these spectral characteristics. However, redundancy and multicollinearity are two important aspects of hyperspectral data that cannot be ignored. Many studies on estimating crop parameters therefore use the characteristic spectra in the visible and near-infrared regions (blue (460 nm), green (560 nm), red (670 nm), near-infrared (800 nm), and red-edge bands) [21,22,23].
A vegetation index (VI) is a combination of two or more characteristic spectra acquired by multispectral or hyperspectral remote-sensing techniques. It is a simple, effective, and empirical measure of the status of surface vegetation. Many VIs have been developed to estimate crop parameters, such as the normalized difference VI (NDVI) [21], the two-band enhanced VI (EVI2) [23], and the optimized soil-adjusted VI (OSAVI) [22]. However, many VIs tend to saturate when used to estimate crop parameters because they are insensitive at medium-to-high canopy cover [24]. Compared with using only optical remote-sensing information, numerous previous studies have obtained fruitful results by using multi-source remote-sensing data. For example, Gao et al. [25] estimated the LAI, height, and biomass of maize by using single-temporal Huanjing-1A/B (spectral) and RADARSAT-2 (synthetic aperture radar, SAR) data. Jin et al. [24] estimated the LAI and biomass of winter wheat by using multi-temporal Huanjing-1A/B (spectral) and RADARSAT-2 (SAR) data. These studies indicate that the best results are obtained when spectral data are combined with light detection and ranging (LiDAR) or SAR data.
Unmanned aerial vehicles may be equipped with snapshot sensors, global navigation satellite systems, and inertial navigation systems, and the combined system can obtain geo-referenced images at various geographic locations. The corresponding field digital orthophoto maps (DOMs) and crop surface models (CSMs) [26] can then be obtained by using digital photogrammetry. UAV-mounted remote sensing can thus provide not only spectral data but, more importantly, information on the vertical growth of crops, such as crop height [27]. Crop height in turn may be used to estimate other crop parameters because it describes the vertical growth state of crops, which traditional satellite optical remote sensing cannot [28].
Numerous studies have used UAV-based remote-sensing data to estimate crop parameters [14,27,29,30]; they focus mostly on CSMs or RGB-based VIs obtained from high-definition digital cameras. The objectives of this study are thus to evaluate the performance of different sensors using CSMs, VIs, and combinations thereof to make multi-temporal estimates and maps of crop parameters. More specifically, we evaluate (i) the performance of crop parameter estimates obtained using near-surface spectroscopy (350~2500 nm; 3 nm at 700 nm, 8.5 nm at 1400 nm, 6.5 nm at 2100 nm), a UAV-mounted snapshot hyperspectral sensor (450~950 nm; 8 nm at 532 nm), and a high-definition digital camera (R, G, B); and (ii) the CSMs, RGB-based VIs, hyperspectral-based VIs, and methods that combine them for making multi-temporal estimates of crop parameters and for mapping the parameters. Moreover, the estimated LAI and AGB are obtained by using linear and exponential equations, random forest (RF) regression, and partial least squares regression (PLSR) to combine the UAV-based spectral VIs and crop heights (from the CSMs). This paper is structured as follows:
(1) Section 2 presents the field sampling and treatment, the UAV remote-sensing data-acquisition methods, and the methods used to generate DOMs and CSMs. It also discusses the selection of VIs, data analysis, estimation methods, and statistical analysis.
(2) Section 3 presents the results and the precision of the estimates of crop parameters (i) when using only RGB-based VIs and hyperspectral-based VIs; (ii) when using only crop height; (iii) when combining the VIs and crop height; and (iv) when combining the spectral VIs and crop height by using random forest regression and partial least squares regression.
(3) Section 4 analyzes and compares the advantages and disadvantages of the four approaches detailed in Section 3. The advantages and disadvantages of remote sensing by UAV-mounted snapshot hyperspectral sensors and high-definition digital cameras are also discussed.
(4) Finally, we discuss potential applications in agriculture of UAV-mounted snapshot hyperspectral sensors and high-definition digital cameras.

2. Materials and Methods

2.1. Experiments

Experiments were conducted at the Xiao Tangshan National Precision Agriculture Research Center of China, which is located in the Changping District (115°50′17″–116°29′49″E, 40°2′18″–40°23′13″N) of Beijing, China. The territory of Changping District is flat, with an average altitude of 36 m, and it has a warm temperate semi-humid continental monsoon climate. Changping District has an average low temperature of −10–7.5 °C, an average high temperature of 35–40 °C, and an annual rainfall of 450 mm (from China Meteorological Data Service, http://data.cma.cn/). The study site has fine-loamy soil.
A total of three plant groups were selected for ground measurements during the winter wheat jointing, flagging, and flowering periods of 2015 (see Figure 1). Winter wheat was grown on 48 plots with different amounts of nitrogen fertilizer and irrigation levels, as shown in Figure 1. We used two winter wheat cultivars ((1) J9843, Beijing Municipal Bureau of Agriculture, China, and (2) ZM175, Institute of Crop Science, Chinese Academy of Agricultural Sciences, China), three water treatments (W0 is rainfall only, W1 is rainfall plus 100 mm, and W2 is rainfall plus 200 mm), and four nitrogen treatments (N0 is no fertilizer, N1 is 195 kg/ha, N2 is 390 kg/ha, and N3 is 780 kg/ha).
Three hyperspectral and high-definition digital images of winter wheat field were acquired by using a snapshot hyperspectral sensor and a high-definition digital camera mounted on a UAV. Ground hyperspectral data were collected by using a near-surface spectrometer. During the ground-measurement campaign, winter wheat LAI, AGB, and canopy height were also measured for each plot. Sensor parameters and data collection and processing methods are discussed in the following sections.

2.2. Ground Measurement of Crop Parameters

2.2.1. Measurement of Field Hyperspectral Reflectance

Ground winter wheat canopy reflectance was measured by using an ASD FieldSpec 3 spectrometer (Analytical Spectral Devices, Boulder, CO, USA). The ASD FieldSpec 3 has a full-range detection capacity (350 nm to 2500 nm; spectral resolution of 3 nm at 700 nm, 8.5 nm at 1400 nm, and 6.5 nm at 2100 nm) and provides uniform visible, near-infrared, and shortwave-infrared data collection over the entire solar spectrum. The data were automatically resampled to 1 nm spacing. Measurements were collected during the jointing (21 April 2015), flagging (26 April 2015), and flowering (13 May 2015) stages of winter wheat growth. Ground measurements were carried out immediately after the UAV flights (from 11:00 a.m. to 2:00 p.m.), under windless conditions and stable light levels. We calibrated the field spectrometer against the reflectance of a 40 cm × 40 cm BaSO4 whiteboard. Winter wheat canopy reflectance was measured 10 times at the center of each plot, and the average reflectance was recorded.

2.2.2. Measurement of Crop Height

Crop height was measured by using a straight measuring stick (from the ground to the highest part of the wheat plant). We measured the height three times at the center of each plot and recorded the average. The sampling location was near the center of each plot, and 20 winter wheat plants growing close to the center were selected. After the ground measurements, the winter wheat organs were processed in the laboratory.

2.2.3. Measurement of Leaf Area Index in Laboratory

In the laboratory, all leaves were stuck onto A4-size paper and the total area of each leaf was measured by using an optical scanner. The winter wheat LAI was calculated as:
\mathrm{LAI} = \frac{S \times n}{p \times l}, \quad (1)
where S is the total area of all the leaves, n is the number of winter wheat ears per unit area, p is the number of selected winter wheat plants (p = 20 in this study), and l is the row spacing (15 cm).

2.2.4. Measurement of Aboveground Biomass in Laboratory

After calculating the LAI, all the leaves and stems from each plot were put into paper bags and dried at 80 °C to remove moisture. Once the sample weight became constant (after about 24 h), the samples were weighed on a balance with an accuracy of 0.001 g. Finally, the biomass per unit area was calculated from the measured planting density and the dry weight of the sample. The winter wheat AGB was calculated by using:
\mathrm{AGB} = \frac{m \times n}{p \times l}, \quad (2)
where m is the dry weight of the sample, n is the number of winter wheat ears per unit area, p is the number of selected winter wheat plants (p = 20 in this study), and l is the row spacing.
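The following minimal Python sketch implements Equations (1) and (2) as written; the function and variable names are ours, and the default arguments simply mirror the sampling protocol described above (p = 20 plants, l = 15 cm).

```python
def leaf_area_index(S, n, p=20, l=0.15):
    """Equation (1): LAI = (S * n) / (p * l).
    S: total scanned leaf area of the sampled plants,
    n: number of winter wheat ears per unit area,
    p: number of sampled plants, l: row spacing (m)."""
    return (S * n) / (p * l)


def above_ground_biomass(m, n, p=20, l=0.15):
    """Equation (2): AGB = (m * n) / (p * l).
    m: dry weight of the sampled plants."""
    return (m * n) / (p * l)
```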

2.3. Acquisition and Processing of Unmanned Aerial Vehicle Remote-Sensing Images

2.3.1. Unmanned Aerial Vehicle, Snapshot Hyperspectral Sensor, and High-Definition Digital Camera

As the UAV sensor platform, we used a DJI S1000 UAV (SZ DJI Technology Co., Ltd., Shenzhen, China) with eight propellers, which is very stable at low flight speed and low altitude. When equipped with two 18,000 mAh (25 V) batteries, it runs for 30 min with a takeoff weight of 6 kg, a flying altitude of 50 m, and a flight speed of 8 m/s.
The Sony DSC–QX100 (Sony DSC–QX100, Sony, Tokyo, Japan) is a high-definition digital camera (Table 1) with a short exposure and integration time. It weighs 0.18 kg and measures 63 × 63 × 56 mm3. It provides digital number (DN) values in the visible (spectral: R, G, B).
The UHD 185 Firefly (Cubert GmbH, Ulm, Baden-Württemberg, Germany) is a snapshot hyperspectral sensor (Table 1). The UHD 185 has a short exposure and integration time, weighs 0.47 kg, and measures 195 × 67 × 60 mm3. Its operating range spans the visible to the near-infrared (450 nm to 950 nm, 8 nm at 532 nm; Table 1). The hyperspectral data cubes were automatically resampled to 4 nm spacing. Collected radiation is recorded as a 1000 × 1000 pixel (1 band) panchromatic image and a 50 × 50 pixel (125 bands) hyperspectral cube. The panchromatic images are rich in texture information and relatively simple to stitch, whereas the 50 × 50 (125 bands) hyperspectral cubes are rich in spectral information but lack texture, so hyperspectral image fusion was used to sharpen the UHD 185 hyperspectral images before stitching. The fusion steps are implemented in the Cubert Cube-Pilot software (Cube-Pilot, Version 1.4, Cubert GmbH, Ulm, Baden-Württemberg, Germany). After fusion, all 1000 × 1000 (125 bands) hyperspectral images were stitched together using an image-stitching process.
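The Cube-Pilot fusion algorithm itself is proprietary; the sketch below only illustrates the general idea of sharpening a coarse hyperspectral cube with a co-registered panchromatic image, using a simple Brovey-style ratio (our assumption, not the Cubert implementation).

```python
import numpy as np
from scipy.ndimage import zoom

def sharpen_hypercube(pan, cube):
    """Illustrative pan-sharpening of a low-resolution hyperspectral cube.

    pan  : 2-D array, e.g. 1000 x 1000 panchromatic image
    cube : 3-D array (rows, cols, bands), e.g. 50 x 50 x 125
    """
    rows, cols = pan.shape
    zr, zc = rows / cube.shape[0], cols / cube.shape[1]
    # Upsample every band to the panchromatic grid (bilinear interpolation).
    cube_up = zoom(cube, (zr, zc, 1), order=1)
    # Brovey-style modulation: scale each pixel's spectrum by the ratio of the
    # panchromatic intensity to the band-averaged upsampled intensity.
    intensity = cube_up.mean(axis=2) + 1e-6
    return cube_up * (pan / intensity)[:, :, None]
```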

2.3.2. Radiometric Calibration

Flights were conducted during the jointing (21 April 2015), flagging (26 April 2015), and flowering (13 May 2015) stages of winter wheat, with the UAV carrying the UHD 185 snapshot hyperspectral sensor (abbreviated as UAV-UHD) and the Sony DSC–QX100 high-definition digital camera (abbreviated as UAV-DC). The UAV-UHD and UAV-DC images were obtained at 50 m altitude and under stable light conditions, so atmospheric correction was not required. The UHD 185 and Sony DSC–QX100 exposure times were fixed according to the intensity of the sunlight.
The UHD 185 was calibrated on the ground before the UAV flight by using Cubert Cube-Pilot software (Version 1.4) and a BaSO4 whiteboard. The original DN values of the UAV-DC images were calibrated by imaging a black-and-white fabric placed on the ground and using
DN_i = \frac{DN_{\mathrm{original}} - DN_{\mathrm{black}}}{DN_{\mathrm{white}} - DN_{\mathrm{black}}} \times 255, \quad (3)
where DN_i is the calibrated DN value of band i (R, G, or B); DN_original is the original DN value of the high-definition digital camera image; and DN_white and DN_black are the original DN values of the white and black fabric in the UAV-DC images (see Figure 1).
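A minimal sketch of this empirical-line normalization (Equation (3)) is given below; dn_black and dn_white would be taken as the mean DN of the black and white fabric pixels in the same image and band.

```python
import numpy as np

def calibrate_dn(dn_original, dn_black, dn_white):
    """Empirical-line normalization of digital camera DN values (Equation (3)).

    dn_original : array of raw DN values for one band (R, G or B)
    dn_black, dn_white : mean DN of the black and white reference fabric
    """
    dn = (dn_original - dn_black) / (dn_white - dn_black) * 255.0
    # Clipping to the 8-bit range is an added safeguard, not part of Equation (3).
    return np.clip(dn, 0, 255)
```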

2.3.3. Generating Digital Orthophotos Maps and Crop Surface Models

The snapshot hyperspectral and high-definition digital images were mosaicked to obtain panoramic hyperspectral and digital orthophoto maps (DOMs) of the entire area. This is typically accomplished by using structure-from-motion photogrammetry, a process that extracts geometric structure from camera images taken from different camera stations. A digital surface model (DSM) is a representation of the terrain surface that records information about the field and crop. The main steps to generate a DOM and DSM from hyperspectral and high-definition digital images are as follows:
(1) Extract feature points by using the scale-invariant feature transform algorithm.
(2) Match the features.
(3) Apply the structure-from-motion algorithm and a bundle-block adjustment to recover the image poses and build sparse three-dimensional (3D) feature points.
(4) Build dense 3D point clouds from the camera poses estimated in Step (3) and the sparse 3D feature points by using the multi-view stereo algorithm.
(5) Build a 3D polygonal mesh of the object surface based on the dense cloud.
These steps are implemented in PhotoScan from Agisoft (Agisoft PhotoScan Professional Pro, Version 1.1.6, Agisoft LLC, 11 Degtyarniy per., St. Petersburg, Russia, hereinafter referred to as PhotoScan). Executing these steps produces a DOM and the corresponding DSM of each flight.
The CSM is generated by subtracting the digital elevation model (DEM) from the DSM (the DEM represents the variations of the field soil surface under the crop). The main steps to generate DEMs, CSMs, and UAV-based crop height from the DSMs are as follows (a minimal code sketch of this raster workflow follows the list):
(1) A total of 279 soil-point coordinates were recorded by using the ARCGIS software (ARCGIS, Environmental Systems Research Institute, Inc., Redlands, CA, USA) and the DSM, and the ARCGIS extract tools were then applied to the DSM to obtain the elevation of each soil point.
(2) The field DEM was obtained by interpolating the surface elevation between soil points by using the ARCGIS kriging tools.
(3) The CSM of winter wheat was calculated by using the ARCGIS raster calculator tools.
(4) The crop height of each plot was obtained by using the ARCGIS ROI tools.
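The sketch below reproduces steps (1)–(4) with open-source tools under two assumptions: the soil-point file and its (row, column) layout are hypothetical, and ordinary kriging is replaced by linear interpolation (scipy.interpolate.griddata) for brevity.

```python
import numpy as np
import rasterio
from scipy.interpolate import griddata

# Illustrative re-implementation of steps (1)-(4); the study used ArcGIS kriging.
with rasterio.open("dsm.tif") as src:          # hypothetical DSM file
    dsm = src.read(1).astype(float)

# (1) Elevation of the surveyed bare-soil points (row, col per line in the file).
soil_rc = np.loadtxt("soil_points_rowcol.csv", delimiter=",")    # shape (279, 2)
soil_z = dsm[soil_rc[:, 0].astype(int), soil_rc[:, 1].astype(int)]

# (2) Interpolate a field DEM between the soil points (linear here, kriging in the study).
rows, cols = np.indices(dsm.shape)
dem = griddata(soil_rc, soil_z, (rows, cols), method="linear")

# (3) Crop surface model: crop height above the interpolated soil surface.
csm = dsm - dem

# (4) Mean crop height of one plot, given a boolean mask of its pixels.
plot_mask = np.zeros(dsm.shape, dtype=bool)    # fill with the plot ROI
plot_height = np.nanmean(csm[plot_mask])
```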
Figure 2 and Figure 3 show the UAV-UHD hyperspectral images, the high-definition digital camera images, and the corresponding crop-height maps of the three plant groups for the three growing periods. The increase in canopy coverage appears clearly in Figure 2 (Figure 2a,c,e) and Figure 3 (Figure 3a,c,e). Meanwhile, the crop-height maps (Figure 2b,d,f and Figure 3b,d,f) also show that crop height increases.

2.4. Data Analysis and Estimation Methods

2.4.1. Selection of Vegetation Indices

To analyze the relationships between vegetation indices and LAI and biomass, this study uses digital camera DN values (R, G, B, r, g, b, B/R, B/G, R/G), RGB-based VIs [excess red (EXR) [31], the visible atmospherically resistant index (VARI) [32], and the green-red vegetation index (GRVI) [31]], and hyperspectral VIs [the bare ground index (BGI) [33], the NDVI [21], the linear combination index (LCI) [34], the normalized pigment chlorophyll ratio index (NPCI) [35], the two-band enhanced vegetation index (EVI2) [23], the optimized soil-adjusted vegetation index (OSAVI) [22], the spectral polygon vegetation index (SPVI) [36], and the modified chlorophyll absorption reflectivity index (MCARI) [37]]. These are listed in Table 2.
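For reference, a few of the indices in Table 2 are written out below; the definitions follow the cited sources, and the hyperspectral band centres are approximated here by generic "red" and "nir" inputs.

```python
# Band inputs are reflectance (hyperspectral) or calibrated DN values (digital camera).

def ndvi(nir, red):                      # Rouse et al. [21]
    return (nir - red) / (nir + red)

def osavi(nir, red):                     # Rondeaux et al. [22]
    return (1 + 0.16) * (nir - red) / (nir + red + 0.16)

def evi2(nir, red):                      # Jiang et al. [23]
    return 2.5 * (nir - red) / (nir + 2.4 * red + 1)

def vari(green, red, blue):              # Gitelson et al. [32]
    return (green - red) / (green + red - blue)

def exr(r_norm, g_norm):                 # excess red, Meyer and Neto [31]
    # r_norm and g_norm are chromatic coordinates, e.g. r = R / (R + G + B)
    return 1.4 * r_norm - g_norm
```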

2.4.2. Statistic Regression Methods

We used linear and exponential equations to estimate LAI and AGB. These equations are given below. Equations (4) and (5) were used to evaluate single spectral or CSM information for crop-parameter estimation, where x represents a VI or the crop height and y represents the crop parameter. Equations (6)–(9) were used to evaluate spectral and CSM information together, where y represents the crop parameter, x_1 represents crop height, and x_2 represents a VI.
y = a \times x + b \quad (4)
y = a \times e^{x} + b \quad (5)
y = a \times x_1 \times x_2 + b \quad (6)
y = a \times e^{x_1 \times x_2} + c \quad (7)
y = a \times x_1 / x_2 + c \quad (8)
y = a \times e^{x_1 / x_2} + c \quad (9)
Note: Equations (4) and (5) can only be used with vegetation indices (VIs) or crop surface model (CSM) information; Equations (6)–(9) can use both VIs and crop height and are abbreviated as the “height × VI” method. In Equations (4) and (5), x is a VI or crop height/CSM information and y is the crop parameter; in Equations (6)–(9), x_1 is crop height/CSM information, x_2 is a VI, and y is the crop parameter.
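A minimal sketch of fitting the linear "height × VI" form of Equation (6) by ordinary least squares is shown below; the exponential forms (Equations (7) and (9)) can be fitted analogously with a nonlinear solver.

```python
import numpy as np

def fit_height_times_vi(height, vi, y):
    """Fit y = a * (x1 * x2) + b (Equation (6)) on the calibration set.
    height: crop height (x1), vi: a vegetation index (x2), y: AGB or LAI."""
    x = np.asarray(height) * np.asarray(vi)
    a, b = np.polyfit(x, y, deg=1)          # least-squares line through (x, y)
    return a, b

def predict_height_times_vi(height, vi, a, b):
    return a * np.asarray(height) * np.asarray(vi) + b
```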
Because linear and exponential equations cannot solve multicollinearity problems, the models in Equations (4)–(9) can only accept a few variables. Previous studies show that RF and PLSR are powerful for dealing with multicollinearity, so we also used RF and PLSR. PLSR is a data-analysis method proposed by Wold [38] and has been widely used in agricultural remote sensing because it offers a powerful empirical regression technique that makes full use of the input data to monitor crop parameters. Random forest is a data-analysis and statistical method for classification and regression proposed by Breiman and Cutler [39]. In recent years, RF has been widely used in machine-learning research. It operates by constructing a multitude of decision trees at training time and outputting the class that is the mode of the classes (classification) or the mean prediction (regression) of the individual trees. RF has high accuracy, good tolerance to outliers and noise, and makes excellent use of the input data.
Calibration and validation of the above-ground biomass and leaf area index estimation models (RF and PLSR) were carried out in MATLAB. The parameters of RF were optimized by using the method of Yue et al. [7]: the number of trees was set to 800 and the parameter mtry was set to 6 for the entire growth period.
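The models were run in MATLAB; the scikit-learn sketch below mirrors the reported RF settings (800 trees, mtry = 6 candidate predictors per split) and shows an equivalent PLSR fit. The number of PLSR latent components is our choice for illustration, not a value reported in the paper.

```python
from sklearn.ensemble import RandomForestRegressor
from sklearn.cross_decomposition import PLSRegression

# X_cal: calibration matrix of VIs (optionally plus crop height), y_cal: AGB or LAI.

def fit_rf(X_cal, y_cal):
    rf = RandomForestRegressor(n_estimators=800, max_features=6, random_state=0)
    return rf.fit(X_cal, y_cal)

def fit_plsr(X_cal, y_cal, n_components=3):
    # n_components is an assumption for this sketch; tune it by cross-validation.
    pls = PLSRegression(n_components=n_components)
    return pls.fit(X_cal, y_cal)
```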

2.4.3. Statistical Analysis

With three measurement campaigns, we acquired a total of 144 datasets. Two groups of data (Groups 2 and 3 in Figure 1, used as the calibration set) were used to analyze the relationships between the spectral, DSM, and combined information and AGB and LAI. Group 1 was used as the validation set. Thus, 96 datasets were used to build the models and 48 datasets to validate them. The correlation analysis was carried out with the statistical software R 3.5.0 using the calibration set.
The coefficient of determination R2, root mean square error (RMSE), normalized root mean square error (nRMSE), and mean absolute error (MAE) were used to evaluate the performance of each model. Mathematically, a higher R2 corresponds to a smaller RMSE, nRMSE and MAE, and thus represents better model accuracy. The following equations were used to calculate R2, RMSE, nRMSE and MAE:
R^2 = 1 - \frac{\sum_{i=1}^{n} (y_i - x_i)^2}{\sum_{i=1}^{n} (y_i - \bar{y})^2}, \quad (10)
\mathrm{RMSE} = \sqrt{\frac{\sum_{i=1}^{n} (x_i - y_i)^2}{n}}, \quad (11)
\mathrm{MAE} = \frac{\sum_{i=1}^{n} |x_i - y_i|}{n}, \quad (12)
\mathrm{nRMSE} = \frac{\mathrm{RMSE}}{\bar{y}}, \quad (13)
where x_i and y_i are the estimated and measured values, respectively; \bar{x} and \bar{y} are the average estimated and measured values, respectively; and n is the number of samples.
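The four metrics of Equations (10)–(13) can be computed as follows (x: estimated values, y: measured values):

```python
import numpy as np

def evaluation_metrics(estimated, measured):
    """R2, RMSE, MAE and nRMSE as defined in Equations (10)-(13)."""
    x = np.asarray(estimated, dtype=float)
    y = np.asarray(measured, dtype=float)
    r2 = 1.0 - np.sum((y - x) ** 2) / np.sum((y - y.mean()) ** 2)
    rmse = np.sqrt(np.mean((x - y) ** 2))
    mae = np.mean(np.abs(x - y))
    nrmse = rmse / y.mean()
    return {"R2": r2, "RMSE": rmse, "MAE": mae, "nRMSE": nrmse}
```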

3. Results and Analysis

Table 3 shows the statistics of the AGB (t/ha), LAI (m2/m2), and crop height (cm) measurements for the three growing periods. The calibration set (96 samples) and the validation set (48 samples) together cover the three stages.

3.1. Spectral and Crop Height

Figure 4a,b shows the mean hyperspectral reflectance spectra of the winter wheat canopy at the three growing stages. The hyperspectral reflectance in the near-infrared bands (about 800 to 1000 nm) first increases and then decreases as the winter wheat grows. This trend does not appear in the calibrated DN values acquired by the high-definition digital camera (Figure 4c) because they do not cover the near-infrared bands. As the winter wheat grows, the crop height increases. The ground-measured crop height (Figure 4d) exceeds the UAV-measured crop height (Figure 4e,f). However, the two UAV-measured crop heights are similar (Figure 4e,f).
Figure 5 shows the Pearson correlation coefficients between AGB, LAI, the VIs, and crop height. For each pair of VI, crop height, AGB, and LAI, a square of varying size and color represents the corresponding Pearson correlation coefficient. The coefficients indicate a positive correlation between crop height and SPVI, LCI, NDVI, OSAVI, EVI2, and BGI (Figure 5a), and a negative correlation between crop height and MCARI and NPCI (Figure 5b). For the VIs built from high-definition digital images, crop height correlates positively with r, R/G, and EXR, but negatively with b, B/R, B/G, VARI, and GRVI (Figure 5c). The results in Figure 5 indicate that the measured VIs are correlated to varying degrees with AGB and LAI. Crop height (H) correlates positively with AGB but is essentially uncorrelated with LAI across the three growing stages. The VIs, for example NDVI, LCI, and OSAVI, are highly correlated with LAI across the three growing stages. Most red and near-infrared-based spectral VIs, for example NDVI, LCI, NPCI, and OSAVI, have a low correlation with AGB across the three growing stages, whereas the R, G, B based VIs, for example b, r, and VARI, are highly correlated with AGB. This sensitivity analysis indicates that AGB estimation based only on multi-temporal spectral VIs may be inefficient. Single- and multi-temporal estimates of AGB and LAI are analyzed in the next section.

3.2. Relationships between (i) Vegetation Indices and Crop Height and (ii) Leaf Area Index and Above-Ground Biomass

3.2.1. Relationships between Vegetation Indices and Leaf Area Index and Above-Ground Biomass

We used linear and nonlinear regression analysis to investigate the VIs with Equations (4) and (5). The best VIs and equations (Table 4) were selected based on the growing stage and the remote-sensing data (G-VIs, DC-VIs, and UHD-VIs). Good relationships between the VIs and LAI and AGB were obtained, and regression equations for winter wheat LAI and AGB were built (see Table 4).
The regression equations for LAI and AGB were then validated, and the resulting R2, MAE, and RMSE are shown in Table 4 and in Figure 6a–c and Figure 7a–c. For the LAI estimated over the three growth stages, the RMSEs range from 0.88 to 1.01, the MAEs range from 0.67 to 0.80, and R2 ranges from 0.46 to 0.59. For the AGB estimated over the three growth stages, the RMSEs range from 1.71 to 2.25 t/ha, the MAEs range from 1.19 to 1.88 t/ha, and R2 ranges from 0.26 to 0.67.
All G-VIs, DC-VIs, and UHD-VIs give good results for a single growing stage. However, the precision decreases when data from all three growing stages are used in the modeling. The results show that G-VIs are effective for estimating LAI and AGB in the jointing and flagging stages, whereas DC-VIs are more effective in the flowering stage and over all three stages. The best G-VIs for estimating LAI or AGB vary between the growing stages; however, the best G-VI for estimating LAI and AGB gives the same results for single growth stages and over all three growth stages.
Figure 6a–c shows the relationship between the best VIs and AGB and the associated validation accuracy for the three sensors. Figure 7a–c shows the relationship between the best VIs and LAI and the associated validation accuracy for the three sensors. The LCI (Table 4 and Figure 6a,b) performs best in estimating LAI and AGB among the eight investigated hyperspectral VIs. However, the relationship between LCI and LAI is much stronger than that between LCI and AGB (Figure 7a,b): LCI stays within a similar range across the growth stages while AGB keeps increasing (Figure 6a,b). During reproductive growth (flowering), photosynthetic products are mostly stored in reproductive organs (e.g., the flower and ear of wheat), whereas during vegetative growth they are mostly stored in stems and leaves. Thus, it is difficult to monitor AGB by using only spectral VIs (Table 4 and Figure 6a,b).

3.2.2. Relationships between (i) Crop Height and (ii) Leaf Area Index and Above-Ground Biomass

Data from the three single growing stages and from all three growing stages together were used to evaluate the relationships between crop height and LAI and AGB. We applied linear and nonlinear regression analysis to crop height by using Equations (4) and (5). The results indicate a low correlation between crop height and LAI and AGB for single growing stages, especially the jointing stage. A good correlation exists between crop height and AGB when using data from all three growing stages (see Table 5 and Figure 6d–f). However, because crop height correlates only weakly with LAI, it is difficult to use crop height directly to estimate LAI when the data cover multiple growing periods (see Table 5 and Figure 7d–f).
The regression equations for LAI and AGB were then validated, and the resulting R2, MAE, and RMSE are shown in Table 5 and in Figure 6 and Figure 7. For the AGB estimated over all three stages, the RMSEs range from 1.39 to 1.49 t/ha, the MAEs range from 1.00 to 1.08 t/ha, and R2 ranges from 0.71 to 0.75.
Figure 6d–f shows the relationship between crop height and AGB and the associated validation accuracy for the three sensors. The results (Figure 6d–f) indicate that crop height keeps increasing along with AGB across the growing stages, so crop height can be used for multi-temporal AGB estimation. More importantly, crop height (Table 5) performs better in estimating multi-temporal AGB than any of the investigated VIs (Table 4). Figure 7d–f shows the relationship between crop height and LAI and the associated validation accuracy for the three sensors. The results (Figure 7d–f and Table 5) indicate that crop height is weakly correlated with LAI, especially in the multi-temporal case.

3.2.3. Relationships between Crop Height and Vegetation Indices and Leaf Area Index and Above-Ground Biomass

We applied linear and nonlinear regression analysis to crop height and the selected VIs by using Equations (6)–(9). The selected VIs are those in Table 4, and the resulting equations are given in Table 6.
The regression equations for LAI and AGB were then validated, and the resulting R2, MAE, and RMSE are given in Table 6 and in Figure 6g–i and Figure 7g–i. The results show that combining VIs and crop height does not improve the precision of the LAI estimates (Table 6). For AGB over all three growing stages, however, combining crop height with the ASD-based LCI is more effective: the ground validation improves from R2 = 0.26 to 0.81, MAE from 1.88 to 0.99 t/ha, and RMSE from 2.25 to 1.32 t/ha. Likewise, for the UHD-height validation, R2 improves from 0.30 to 0.74, MAE from 1.84 to 1.05 t/ha, and RMSE from 2.18 to 1.38 t/ha. The UAV-DC results also improve, with R2 rising from 0.67 to 0.77, MAE from 1.19 to 1.02 t/ha, and RMSE from 1.71 to 1.30 t/ha.
In this study, the VIs from the three sensors (Table 1) were individually combined with crop height to analyze the relationships of height × VI with LAI and AGB. The relationships between crop height × VI and AGB for the three sensors are shown in Figure 6g–i. The results (Figure 6g–i and Table 6) indicate that AGB estimation accuracy can be improved by multiplying crop height by a VI, and the height × VI regression for estimating AGB is well fit by power-type regression equations. For LAI, we find that crop height × VI is weakly correlated with LAI in multi-temporal estimation (Figure 7g–i and Table 6). In addition, the height × VI based LAI estimation accuracy is also lower than that obtained by using only VIs in single stages.

3.3. Using Crop Height and Vegetation Indices with Partial Least Squares Regression and Random Forest Regression to Estimate Leaf Area Index and Above-Ground Biomass

The RF and PLSR methods were used to combine the VIs and crop height to estimate the LAI and AGB of winter wheat, with all of the VIs listed in Table 2 as predictors. Table 7 shows the estimates of AGB and LAI based on the VIs alone using the RF and PLSR methods. The VI-based AGB models have best R2 values of 0.93 (RF: MAE = 0.52, RMSE = 0.72, nRMSE = 13.22%) and 0.61 (PLSR: MAE = 1.19, RMSE = 1.62, nRMSE = 29.76%). The VI-based LAI models have R2 = 0.94 (RF: MAE = 0.26, RMSE = 0.33, nRMSE = 8.75%) and 0.58 (PLSR: MAE = 0.64, RMSE = 0.86, nRMSE = 22.81%). Table A1 ranks the predictor variables according to their importance in the AGB and LAI estimation.
Table 8 shows the AGB and LAI estimates based on crop height and the VIs using the RF and PLSR methods. The results indicate that the AGB estimate can be improved by combining crop height and the VIs with the RF and PLSR methods. The best R2 values are 0.96 (RF: MAE = 0.40, RMSE = 0.57, nRMSE = 10.47%) and 0.81 (PLSR: MAE = 0.83, RMSE = 1.12, nRMSE = 20.58%). However, the accuracy of the LAI estimates based on crop height and VIs remains almost unchanged, with R2 values of 0.95 (RF: MAE = 0.24, RMSE = 0.31, nRMSE = 8.22%) and 0.65 (PLSR: MAE = 0.56, RMSE = 0.77, nRMSE = 20.42%).
We used the validation dataset to validate the crop height, VIs and PLSR and RF methods for estimating LAI and AGB. Figure 8 and Figure 9 show the relationships (fitting formula: y = ax + b, R2, MAE, and RMSE) between the predicted and measured winter wheat AGB and LAI. The results show that the methods based on crop height and VIs provide better estimates of AGB than those based only on VIs (Figure 8). More importantly, all results based on crop height and VIs fit better (their slopes a are closer to unity; see Figure 8). As is the case for the LAI modeling dataset, the relationship in Figure 9 indicates that no notable difference exists between methods based on crop height and VIs and those based only on VIs.
The results thus indicate that the PLSR and RF techniques could be used to improve the accuracy of estimates of AGB and LAI (Table 7 and Table 8) compared with that available with single VIs (Table 4), crop height (Table 5), or simple combinations thereof (Table 6).

3.4. Mapping Leaf Area Index and Above-Ground Biomass

We estimated the spatial distribution of AGB (Figure 10) from all UHD and DC VIs plus crop height by using PLSR. The spatial distributions of the UHD-based AGB maps (Figure 10a–c) are similar to those of the UAV-DC maps (Figure 10d–f). An increase in AGB occurred from 21 April to 13 May 2015 (Figure 10). The average of the UHD-based AGB maps rises from 3.05 t/ha (21 April) to 5.84 t/ha (26 April) to 7.86 t/ha (13 May), and the average of the DC-based AGB maps rises from 3.30 t/ha (21 April) to 6.49 t/ha (26 April) to 8.46 t/ha (13 May). The AGB maps indicate that the AGB of most plots is less than 6 t/ha on 21 April, whereas most plots exceed 6 t/ha on 26 April and almost all plots have AGB > 6 t/ha on 13 May. These mapping results are consistent with our field observations (Table 3) and with the precision evaluation (Figure 8).
The spatial distribution of the LAI (Figure 11) was estimated from all UHD and DC VIs by using PLSR. The UHD-based LAI maps are shown in Figure 11a–c and the DC-based LAI maps in Figure 11d–f. From 21 April to 26 April, both the UHD (from 2.95 to 3.35) and DC (from 3.06 to 3.54) LAI maps show that the LAI increased in all plots. However, the averaged LAI decreased from 26 April to 13 May (UHD map: 3.07; DC map: 2.91). These results are consistent with our field observations (Table 3) and with the precision evaluation (Figure 9).
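For illustration, per-pixel maps such as those in Figure 10 and Figure 11 can be produced by applying the fitted PLSR model to a raster stack of the same predictors used for calibration; the sketch below shows this step only and is not the authors' exact workflow.

```python
import numpy as np

def map_with_plsr(pls_model, predictor_stack):
    """Apply a fitted PLSR model pixel by pixel to map AGB or LAI.

    predictor_stack : 3-D array (rows, cols, n_predictors) holding the same
                      VIs (and crop height) used to calibrate the model.
    Returns a 2-D map with the model prediction for every pixel.
    """
    rows, cols, n_pred = predictor_stack.shape
    flat = predictor_stack.reshape(-1, n_pred)
    valid = np.all(np.isfinite(flat), axis=1)        # skip masked/no-data pixels
    pred = np.full(flat.shape[0], np.nan)
    pred[valid] = pls_model.predict(flat[valid]).ravel()
    return pred.reshape(rows, cols)
```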

4. Discussion

4.1. Leaf Area Index and Above-Ground Biomass Estimation Using Vegetation Indices

From 21 April to 26 April, the averaged LAI increased in all plots but decreased from 26 April to 13 May (Table 3 and Figure 10 and Figure 11). In the current study, both the UAV-UHD and G-ASD mean hyperspectral reflectance spectra of the winter wheat canopy first increased and then decreased (Figure 4), which is consistent with the field measurements of LAI. The present results suggest that the LAI can be accurately estimated using single VIs both in specific single growing stages and over multiple growing stages (Table 4). Our results also suggest that the AGB can be accurately estimated by using single VIs in single growing stages (Table 4). However, we found that the AGB changes (Table 3 and Figure 10 and Figure 11) do not match the spectral changes (Figure 4a,b) in the multi-temporal case. During reproductive growth (flowering), photosynthetic products are mostly stored in reproductive organs (e.g., the flower and ear of wheat), whereas during vegetative growth (jointing and flagging) they are mostly stored in stems and leaves [40]. Therefore, it is difficult to use the traditional VIs based on red and near-infrared bands to estimate multi-temporal AGB.
In addition, many VIs saturate when used to estimate crop parameters because they are not very sensitive at medium-to-high canopy cover [24,41]. From the jointing stage to the flagging and flowering stages, the winter wheat canopy coverage increases continuously, so estimates of LAI and AGB face different levels of canopy coverage. The responses of the VIs used for estimating winter wheat LAI and AGB are correlated with dry matter, pigment, and water content [18,42], which means the sensitivity of the VIs depends on the growing stage of winter wheat. Therefore, identifying the optimal predictor variables is crucial for estimating crop parameters from remote-sensing data and also helps in selecting suitable sensors.

4.2. Leaf Area Index and Above-Ground Biomass Estimation Using Vegetation Indices and Crop Height

Crop height can be used to estimate AGB at medium-to-high canopy coverage because crop height describes the vertical-growth state [26,43]. The present results indicate that the accuracy of AGB estimates based on crop height for single growing stages is worse than that based on spectral VIs. However, the overall modeling accuracy is greatly improved. This shows that crop vertical growth has a greater effect for estimating AGB over multiple growing stages. In the current study, the VIs and crop height are simply multiplied to estimate the AGB of winter wheat. The combined height and VIs provides better estimates of winter wheat AGB than do single VIs or crop height (Table 4, Table 5 and Table 6). Possoch et al. [30] and Yue et al. [44] reported that the drawbacks of crop height and VIs can be diminished when they are combined for modeling. The better performance of height × VI for estimating AGB over multiple growing stages is attributed to the capacity of crop height to describe vertical growth at medium-to-high canopy cover. Furthermore, the values predicted when using only spectral VIs are underestimated when AGB >6 t/ha (Figure 8c,d,g,h,k,l). This is because the spectral VIs lose sensitivity for estimating AGB in the later winter wheat growing stages when building estimation models based on multiple growing stages [44,45]. The addition of crop height makes up for the shortcomings of only spectral VIs. In addition, the advantages of spectral VIs and CSMs combine to further improve their value for applications, which will enhance the development of snapshot-sensor technology. This is because CSMs can be generated from inexpensive snapshot-sensor images.
The present results also indicate that height × VIs does not provide reasonable estimates of winter wheat LAI over multiple growing stages. From jointing to flagging and flowering growing stages, our field investigation indicates that the LAI increases and then decreases as the crop height increases (Table 3). Therefore, crop height (CSM) correlates poorly with the LAI for models spanning all three winter wheat growing stages. The present study uses the RF and PLSR to combine spectral VI and CSM remote-sensing data for estimating crop parameters and mapping the spatial distribution of winter wheat AGB and LAI. The result suggests that combining multiple types of remote sensing data (spectral and CSM data) using RF or PLSR is appropriate for estimating winter wheat AGB and LAI. The accuracy of the estimates of LAI and AGB is better when using RF and PLSR (Table 7 and Table 8) than when using linear and exponential estimates (Table 6). These results are consistent with those of Refs. [46,47,48], which suggest that estimation accuracy can be improved with PLSR and RF. The present results also prove that RF and PLSR can be used to combine multiple types of remote-sensing data (spectral and CSM data) to estimate AGB of winter wheat.
Comparing the modeling results (Table 7 and Table 8) with the validation results (Figure 8 and Figure 9) shows that PLSR provides better stability than RF. Overfitting may affect the RF estimates of AGB and LAI given the small number of samples. Therefore, we used PLSR to map AGB and LAI.

4.3. Leaf Area Index and Above-Ground Biomass Estimation Performance of Different Sensors

The present results indicate that AGB can be accurately estimated by combining crop height and VIs generated from UAV-based snapshot hyperspectral sensors and high-definition digital cameras. In addition, no significant difference appears in the estimates of LAI when using only VIs or when combining VIs and crop height. The present results indicate that LAI can be properly estimated over multiple growing stages when using hyperspectral VIs and RF or PLSR. Ranking LAI estimates from best to worst gives ground-spectrometer, UAV snapshot hyperspectral sensor, and UAV high-definition digital camera. This ranking matches the spectral performance of the three sensors; thus the spectral performance of sensors may be crucial for estimating LAI over multiple growing stages. These results suggest that winter wheat LAI cannot be properly estimated using only CSMs (crop height) for multiple growing stages in Changping District, Beijing, China. Future studies are needed to verify these results for different crops and different locations.

5. Conclusions

This study compared ground spectrometer data with remote-sensing data from UAV-based snapshot hyperspectral sensors and high-definition digital cameras for estimating winter wheat LAI and AGB. Leaf area index and above-ground biomass were estimated by combining the spectral VIs and crop height (CSMs) via linear and exponential equations, RF, and PLSR.
(i) We found that the correlation between VI × height (G-height × LCI, DC-height × r, and UHD-height × LCI) and AGB is much greater than that obtained when using a single VI or crop height alone. The results indicate that the combined use of VIs and crop height (CSMs) provides more accurate estimates of winter wheat AGB.
(ii) The combined methods (VI × height) for estimating AGB from the UAV-mounted snapshot hyperspectral sensor and the high-definition digital camera provide an accuracy similar to that obtained when using a ground spectrometer to collect the data.
(iii) The winter wheat LAI cannot be properly estimated using crop height over the three growing stages. The results suggest that crop height is not a key variable for estimating wheat LAI when using remote-sensing data from all three growing stages. Therefore, the spectral performance of the sensors is crucial for estimating LAI over multiple growing stages. Ranking the LAI estimates from most to least accurate gives ground spectrometer, UAV snapshot hyperspectral sensor, and UAV high-definition digital camera.

Author Contributions

J.Y., H.F. analyzed the data and wrote the manuscript. X.J., H.Y., Z.L., C.Z. and G.Y. collected the UAV images. Z.L., J.Y. and C.Z. collected the ground AGB, LAI and ASD hyperspectral data. X.J., Z.L. and Q.T. provided comments and suggestions for the manuscript and checked the writing.

Funding

This study was supported by the National Key Research and Development Program (2016YFD0300603), the Natural Science Foundation of China (41601346, 41601369, 41501481, 61661136003, 41771370, 41471285, 41471351).

Acknowledgments

Thanks to Hong Chang and Weiguo Li for the field data collection and farmland management. Thanks to all employees of Xiao Tangshan National Precision Agriculture Research Center.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A

In the RF estimation, the RF models also provide the importance of the predictor variables. Table A1 ranks the predictor variables according to their importance in the AGB and LAI estimation.
Table A1. Ranking of predictor variables according to their importance in AGB and LAI estimation (RF). "H, VIs" denotes models using crop height plus VIs; "VIs" denotes models using VIs only.

| Rank | AGB Ground (H, VIs) | AGB Ground (VIs) | AGB UAV-UHD (H, VIs) | AGB UAV-UHD (VIs) | AGB UAV-DC (H, VIs) | AGB UAV-DC (VIs) | LAI Ground (H, VIs) | LAI Ground (VIs) | LAI UAV-UHD (H, VIs) | LAI UAV-UHD (VIs) | LAI UAV-DC (H, VIs) | LAI UAV-DC (VIs) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 1 | H | BGI | H | BGI | r | r | NDVI | NDVI | NDVI | NDVI | B | B |
| 2 | BGI | LCI | BGI | LCI | H | VARI | LCI | NPCI | LCI | NPCI | B/G | B/G |
| 3 | LCI | NPCI | LCI | NPCI | VARI | B/R | NPCI | LCI | NPCI | LCI | G | G |
| 4 | MCARI | SPVI | MCARI | SPVI | B/R | EXR | OSAVI | OSAVI | BGI | BGI | b | b |
| 5 | NPCI | MCARI | NPCI | MCARI | R/G | GRVI | MCARI | MCARI | H | OSAVI | B/R | B/R |
| 6 | SPVI | EVI2 | EVI2 | EVI2 | GRVI | R/G | H | EVI2 | OSAVI | SPVI | R | R |
| 7 | EVI2 | OSAVI | SPVI | OSAVI | EXR | EXG | EVI2 | SPVI | SPVI | EVI2 | H | r |
| 8 | OSAVI | NDVI | NDVI | NDVI | b | GLA | SPVI | BGI | EVI2 | MCARI | r | EXG |
| 9 | NDVI |  | OSAVI |  | B/G | B | BGI |  | MCARI |  | EXG | VARI |
| 10 |  |  |  |  | GLA | G |  |  |  |  | GLA | GLA |
| 11 |  |  |  |  | EXG | B/G |  |  |  |  | VARI | R/G |
| 12 |  |  |  |  | G | g |  |  |  |  | g | EXR |
| 13 |  |  |  |  | B | b |  |  |  |  | EXR | g |
| 14 |  |  |  |  | R | R |  |  |  |  | R/G | GRVI |
| 15 |  |  |  |  | g |  |  |  |  |  | GRVI |  |

References

1. Li, X.; Zhang, Y.; Luo, J.; Jin, X.; Xu, Y.; Yang, W. Quantification winter wheat LAI with HJ-1CCD image features over multiple growing seasons. Int. J. Appl. Earth Obs. Geoinf. 2016, 44, 104–112.
2. Liu, B.; Asseng, S.; Wang, A.; Wang, S.; Tang, L.; Cao, W.; Zhu, Y.; Liu, L. Modelling the effects of post-heading heat stress on biomass growth of winter wheat. Agric. For. Meteorol. 2017, 247, 476–490.
3. Launay, M.; Guerif, M. Assimilating remote sensing data into a crop model to improve predictive performance for spatial applications. Agric. Ecosyst. Environ. 2005, 111, 321–339.
4. Huang, J.; Sedano, F.; Huang, Y.; Ma, H.; Li, X.; Liang, S.; Tian, L.; Zhang, X.; Fan, J.; Wu, W. Assimilating a synthetic Kalman filter leaf area index series into the WOFOST model to improve regional winter wheat yield estimation. Agric. For. Meteorol. 2016, 216, 188–202.
5. Cho, M.A.; Skidmore, A.; Corsi, F.; van Wieren, S.E.; Sobhan, I. Estimation of green grass/herb biomass from airborne hyperspectral imagery using spectral indices and partial least squares regression. Int. J. Appl. Earth Obs. Geoinf. 2007, 9, 414–424.
6. Atzberger, C.; Darvishzadeh, R.; Immitzer, M.; Schlerf, M.; Skidmore, A.; le Maire, G. Comparative analysis of different retrieval methods for mapping grassland leaf area index using airborne imaging spectroscopy. Int. J. Appl. Earth Obs. Geoinf. 2015, 43, 19–31.
7. Yue, J.; Feng, H.; Yang, G.; Li, Z. A comparison of regression techniques for estimation of above-ground winter wheat biomass using near-surface spectroscopy. Remote Sens. 2018, 10.
8. Galvão, L.S.; Formaggio, A.R.; Tisot, D.A. Discrimination of sugarcane varieties in Southeastern Brazil with EO-1 Hyperion data. Remote Sens. Environ. 2005, 94, 523–534.
9. Mishra, S.; Mishra, D.R. Normalized difference chlorophyll index: A novel model for remote estimation of chlorophyll-a concentration in turbid productive waters. Remote Sens. Environ. 2012, 117, 394–406.
10. Meroni, M.; Rossini, M.; Guanter, L.; Alonso, L.; Rascher, U.; Colombo, R.; Moreno, J. Remote sensing of solar-induced chlorophyll fluorescence: Review of methods and applications. Remote Sens. Environ. 2009, 113, 2037–2051.
11. Bacour, C.; Baret, F.; Béal, D.; Weiss, M.; Pavageau, K. Neural network estimation of LAI, fAPAR, fCover and LAI×Cab, from top of canopy MERIS reflectance data: Principles and validation. Remote Sens. Environ. 2006, 105, 313–325.
12. Li, W.; Weiss, M.; Waldner, F.; Defourny, P.; Demarez, V.; Morin, D.; Hagolle, O.; Baret, F. A generic algorithm to estimate LAI, FAPAR and FCOVER variables from SPOT4_HRVIR and landsat sensors: Evaluation of the consistency and comparison with ground measurements. Remote Sens. 2015, 7, 15494–15516.
13. Maimaitijiang, M.; Ghulam, A.; Sidike, P.; Hartling, S.; Maimaitiyiming, M.; Peterson, K.; Shavers, E.; Fishman, J.; Peterson, J.; Kadam, S.; et al. Unmanned Aerial System (UAS)-based phenotyping of soybean using multi-sensor data fusion and extreme learning machine. ISPRS J. Photogramm. Remote Sens. 2017, 134, 43–58.
14. Zhou, X.; Zheng, H.B.; Xu, X.Q.; He, J.Y.; Ge, X.K.; Yao, X.; Cheng, T.; Zhu, Y.; Cao, W.X.; Tian, Y.C. Predicting grain yield in rice using multi-temporal vegetation indices from UAV-based multispectral and digital imagery. ISPRS J. Photogramm. Remote Sens. 2017, 130, 246–255.
15. Jin, X.; Liu, S.; Baret, F.; Hemerlé, M.; Comar, A. Estimates of plant density of wheat crops at emergence from very low altitude UAV imagery. Remote Sens. Environ. 2017, 198, 105–114.
16. Yu, N.; Li, L.; Schmitz, N.; Tian, L.F.; Greenberg, J.A.; Diers, B.W. Development of methods to improve soybean yield estimation and predict plant maturity with an unmanned aerial vehicle based platform. Remote Sens. Environ. 2016, 187, 91–101.
17. Taniguchi, K.; Obata, K.; Yoshioka, H. Derivation and approximation of soil isoline equations in the red–near-infrared reflectance subspace. J. Appl. Remote Sens. 2014, 8, 083621.
18. Berger, K.; Atzberger, C.; Danner, M.; D’Urso, G.; Mauser, W.; Vuolo, F.; Hank, T. Evaluation of the PROSAIL model capabilities for future hyperspectral model environments: A review study. Remote Sens. 2018, 10.
19. Koetz, B.; Baret, F.; Poilvé, H.; Hill, J. Use of coupled canopy structure dynamic and radiative transfer models to estimate biophysical canopy characteristics. Remote Sens. Environ. 2005, 95, 115–124.
20. Broge, N.H.; Leblanc, E. Comparing prediction power and stability of broadband and hyperspectral vegetation indices for estimation of green leaf area index and canopy chlorophyll density. Remote Sens. Environ. 2001, 76, 156–172.
21. Rouse, J.W.; Haas, R.H.; Schell, J.A.; Deering, D.W. Monitoring Vegetation Systems in the Great Plains with ERTS. Third Earth Resour. Technol. Satell. Symp. 1973, 1, 325–333.
22. Rondeaux, G.; Steven, M.; Baret, F. Optimization of soil-adjusted vegetation indices. Remote Sens. Environ. 1996, 55, 95–107.
23. Jiang, Z.; Huete, A.R.; Didan, K.; Miura, T. Development of a two-band enhanced vegetation index without a blue band. Remote Sens. Environ. 2008, 112, 3833–3845.
24. Jin, X.; Yang, G.; Xu, X.; Yang, H.; Feng, H.; Li, Z.; Shen, J.; Zhao, C.; Lan, Y. Combined multi-temporal optical and radar parameters for estimating LAI and biomass in winter wheat using HJ and RADARSAR-2 data. Remote Sens. 2015, 7, 13251–13272.
25. Gao, S.; Niu, Z.; Huang, N.; Hou, X. Estimating the Leaf Area Index, height and biomass of maize using HJ-1 and RADARSAT-2. Int. J. Appl. Earth Obs. Geoinf. 2013, 24, 1–8.
26. Hoffmeister, D.; Bolten, A.; Curdt, C.; Waldhoff, G.; Bareth, G. High-resolution Crop Surface Models (CSM) and Crop Volume Models (CVM) on field level by terrestrial laser scanning. Int. Soc. Opt. Eng. 2009, 7840, 78400E.
27. Bendig, J.; Bolten, A.; Bennertz, S.; Broscheit, J.; Eichfuss, S.; Bareth, G. Estimating biomass of barley using crop surface models (CSMs) derived from UAV-based RGB imaging. Remote Sens. 2014, 6, 10395–10412.
28. Malambo, L.; Popescu, S.C.; Murray, S.C.; Putman, E.; Pugh, N.A.; Horne, D.W.; Richardson, G.; Sheridan, R.; Rooney, W.L.; Avant, R.; et al. Multitemporal field-based plant height estimation using 3D point clouds generated from small unmanned aerial systems high-resolution imagery. Int. J. Appl. Earth Obs. Geoinf. 2018, 64, 31–42.
29. Holman, F.H.; Riche, A.B.; Michalski, A.; Castle, M.; Wooster, M.J.; Hawkesford, M.J. High throughput field phenotyping of wheat plant height and growth rate in field plot trials using UAV based remote sensing. Remote Sens. 2016, 8.
30. Possoch, M.; Bieker, S.; Hoffmeister, D.; Bolten, A.A.; Schellberg, J.; Bareth, G. Multi-Temporal crop surface models combined with the rgb vegetation index from UAV-based images for forage monitoring in grassland. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. SPRS Arch. 2016, 991–998.
31. Meyer, G.E.; Neto, J.C. Verification of color vegetation indices for automated crop imaging applications. Comput. Electron. Agric. 2008, 63, 282–293.
32. Gitelson, A.A.; Kaufman, Y.J.; Stark, R.; Rundquist, D. Novel algorithms for remote estimation of vegetation fraction. Remote Sens. Environ. 2002, 80, 76–87.
33. Zarco-Tejada, P.J.; Berjón, A.; López-Lozano, R.; Miller, J.R.; Martín, P.; Cachorro, V.; González, M.R.; De Frutos, A. Assessing vineyard condition with hyperspectral indices: Leaf and canopy reflectance simulation in a row-structured discontinuous canopy. Remote Sens. Environ. 2005, 99, 271–287.
34. Datt, B. A new reflectance index for remote sensing of chlorophyll content in higher plants: Tests using Eucalyptus leaves. J. Plant Physiol. 1999, 154, 30–36.
35. Peñuelas, J.; Gamon, J.A.; Fredeen, A.L.; Merino, J.; Field, C.B. Reflectance indices associated with physiological changes in nitrogen- and water-limited sunflower leaves. Remote Sens. Environ. 1994, 48, 135–146.
36. Vincini, M.; Frazzi, E. Angular dependence of maize and sugar beet VIs from directional CHRIS/Proba data. Cuore 2005, 5–9.
37. Daughtry, C.S.T.; Walthall, C.L.; Kim, M.S.; De Colstoun, E.B.; McMurtrey, J.E. Estimating corn leaf chlorophyll concentration from leaf and canopy reflectance. Remote Sens. Environ. 2000, 74, 229–239.
38. Wold, H. Estimation of principal components and related models by iterative least squares. In Multivariate Analysis; Academic Press: New York, NY, USA, 1966; pp. 1391–1420. ISBN 0471411256.
39. Breiman, L. Random forests. Mach. Learn. 2001, 45, 5–32.
40. Wang, J.; Zhao, C.; Huang, W. Fundamental and Application of Quantitative Remote Sensing in Agriculture; Science China Press: Beijing, China, 2008.
41. Nguy-Robertson, A.; Gitelson, A.; Peng, Y.; Viña, A.; Arkebauer, T.; Rundquist, D. Green leaf area index estimation in maize and soybean: Combining vegetation indices to achieve maximal sensitivity. Agron. J. 2012, 104, 1336–1347.
42. Quan, X.; He, B.; Yebra, M.; Yin, C.; Liao, Z.; Zhang, X.; Li, X. A radiative transfer model-based method for the estimation of grassland aboveground biomass. Int. J. Appl. Earth Obs. Geoinf. 2017, 54, 159–168.
43. Bendig, J.; Bolten, A.; Bareth, G. UAV-based Imaging for Multi-Temporal, very high Resolution Crop Surface Models to monitor Crop Growth Variability. PFG 2013, 551–562.
44. Yue, J.; Yang, G.; Li, C.; Li, Z.; Wang, Y.; Feng, H.; Xu, B. Estimation of winter wheat above-ground biomass using unmanned aerial vehicle-based snapshot hyperspectral sensor and crop height improved models. Remote Sens. 2017, 9.
45. Sun, H.; Li, M.Z.; Zhao, Y.; Zhang, Y.E.; Wang, X.M.; Li, X.H. The Spectral Characteristics and Chlorophyll Content at Winter Wheat Growth Stages. Spectrosc. Spectr. Anal. 2010, 30, 192–196.
46. Jin, X.; Ma, J.; Wen, Z.; Song, K. Estimation of maize residue cover using Landsat-8 OLI image spectral information and textural features. Remote Sens. 2015, 7, 14559–14575.
47. Yue, J.; Yang, G.; Feng, H. Comparative of remote sensing estimation models of winter wheat biomass based on random forest algorithm. Nongye Gongcheng Xuebao/Trans. Chin. Soc. Agric. Eng. 2016, 32, 175–182.
48. Farifteh, J.; Van der Meer, F.; Atzberger, C.; Carranza, E.J.M. Quantitative analysis of salt-affected soil reflectance spectra: A comparison of two adaptive methods (PLSR and ANN). Remote Sens. Environ. 2007, 110, 59–78.
Figure 1. Location of study area and experimental design: (a) location of study area in China; (b) map showing Changping District in Beijing City; (c) design of treatments and images of ground-measurement field acquired from unmanned aerial vehicle mounted high-definition digital camera.
Figure 2. UAV-UHD hyperspectral images and corresponding crop height on (a,b) 21 April; (c,d) 26 April; and (e,f) 13 May 2015.
Figure 3. Digital camera images and corresponding crop height on (a,b) 21 April; (c,d) 26 April; and (e,f) 13 May 2015.
Figure 4. Averaged hyperspectral reflectance spectra, averaged DN values, and crop height in the three growing stages: (a) G-hyperspectral; (b) UHD-hyperspectral; (c) calibrated DC-DN values; (d) G-height; (e) UHD-height; (f) DC-height. G- indicates data measured by the ground-based ASD spectrometer and measuring stick; UHD- indicates data measured using the UHD 185 mounted on the UAV; and DC- indicates data measured using the digital camera mounted on the UAV.
Figure 5. Pearson correlation coefficients between VIs, crop height (H), AGB, and LAI: (a) G-VIs; (b) UHD-VIs; (c) DC-VIs.
Figure 6. Relationship between best VIs and AGB: (a) G-LCI; (b) UHD-LCI; (c) DC-r; (d) G-height; (e) UHD-height; (f) DC-height; (g) G-height × LCI; (h) UHD-height × LCI; (i) DC-height × r.
Figure 7. Relationship between best VIs and LAI: (a) G-LCI; (b) UHD-LCI; (c) DC-B; (d) G-height; (e) UHD-height; (f) DC-height; (g) G-height × LCI; (h) UHD-height × LCI; (i) DC-height × B.
Figure 8. Relationship between the predicted and measured winter wheat AGB (t/ha): (a) G-height, VIs, and PLSR; (b) G-height, VIs, and RF; (c) G-VIs and PLSR; (d) G-VIs and RF; (e) UHD-height, VIs, and PLSR; (f) UHD-height, VIs, and RF; (g) UHD-VIs and PLSR; (h) UHD-VIs and RF; (i) DC-height, VIs, and PLSR; (j) DC-height, VIs, and RF; (k) DC-VIs and PLSR; (l) DC-VIs and RF (validation dataset, mean AGB = 4.58 t/ha).
Figure 9. Relationship between predicted and measured winter wheat LAI (m2/m2): (a) G-height, VIs, and PLSR; (b) G-height, VIs, and RF; (c) G-VIs and PLSR; (d) G-VIs and RF; (e) UHD-height, VIs, and PLSR; (f) UHD-height, VIs, and RF; (g) UHD-VIs and PLSR; (h) UHD-VIs and RF; (i) DC-height, VIs, and PLSR; (j) DC-height, VIs, and RF; (k) DC-VIs and PLSR; (l) DC-VIs and RF (validation dataset, mean LAI = 3.57 m2/m2).
Figure 10. Above-ground Biomass (t/ha) maps made using UAV-UHD, UAV-DC, and PLSR. (a) UHD on 21 April; (b) UHD on 26 April; (c) UHD on 13 May; (d) DC on 21 April; (e) DC on 26 April; and (f) DC on 13 May. Note: UHD- indicates data measured using the snapshot hyperspectral sensor mounted on the UAV; and DC- indicates data measured by using the digital camera mounted on the UAV.
Figure 11. Leaf Area Index (m2/m2) maps based on UAV-UHD and UAV-DC images and with PLSR. (a) UHD on 21 April; (b) UHD on 26 April; (c) UHD on 13 May; (d) DC on 21 April; (e) DC on 26 April; (f) DC on 13 May. Note: UHD- indicates data measured using the snapshot hyperspectral sensor mounted on the UAV; and DC- indicates data measured by using the digital camera mounted on the UAV.
Table 1. Sensor parameters and abbreviations used in this study.
| Platforms | Ground | UAV | UAV |
| Sensor Types | Spectrometer | Snapshot Spectrometer | Digital camera |
| Sensor Names | ASD FieldSpec 3 | UHD 185 | Sony DSC-QX100 |
| Field of view | 25° | 19° | 64° |
| Image size | - | 1000 × 1000 | 5472 × 3648 |
| Working height | 1.3 m | 50 m | 50 m |
| Spectral information | 350~2500 nm | 450~950 nm | R, G, B |
| Original spectral resolution | 3 nm @ 700 nm; 8.5 nm @ 1400 nm; 6.5 nm @ 2100 nm | 8 nm @ 532 nm | - |
| Data spectral resolution | 1 nm | 4 nm | Red, Green and Blue bands |
| Image spatial resolution | - | 2.16 × 2.43 cm | 1.11 × 1.11 cm |
| Height | Steel tape ruler | Photogrammetry method | Photogrammetry method |
| Crop Height Abbreviations | G-height | UHD-height | DC-height |
Note: G- indicates crop height measured by a steel tape ruler; UHD- indicates crop height measured using the snapshot hyperspectral sensor mounted on the UAV; and DC- indicates crop height measured by using the digital camera mounted on the UAV.
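The UAV crop heights in Table 1 come from photogrammetric surface models rather than direct measurement. The sketch below illustrates the general crop-surface-model idea (growth-stage DSM minus bare-soil DSM, averaged per plot); it assumes the two DSMs have already been generated with structure-from-motion software and co-registered, and the array and mask names are illustrative rather than taken from the study's processing chain.

```python
import numpy as np

def crop_height_from_csm(dsm_growth, dsm_bare_soil, plot_mask):
    """Per-plot crop height (m) from two co-registered surface models.

    dsm_growth    : 2D array, surface elevation at the growth stage (m)
    dsm_bare_soil : 2D array, surface elevation of bare soil before emergence (m)
    plot_mask     : 2D boolean array selecting the pixels of one plot
    """
    csm = dsm_growth - dsm_bare_soil           # crop surface model = height above soil
    heights = csm[plot_mask]
    heights = heights[heights > 0]             # drop negatives caused by co-registration noise
    return float(np.mean(heights)) if heights.size else 0.0

# Illustrative call with synthetic rasters (values are made up)
dsm_may = np.full((100, 100), 52.7)            # elevation in May (m a.s.l.)
dsm_soil = np.full((100, 100), 52.0)           # bare-soil elevation (m a.s.l.)
mask = np.zeros((100, 100), dtype=bool)
mask[20:40, 20:40] = True
print(crop_height_from_csm(dsm_may, dsm_soil, mask))   # ~0.70 m
```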
Table 2. Summary of bands and vegetation indices (VIs) used in this study.
| Type | High-Definition Digital Camera VIs | Equation | Hyperspectral Sensor VIs | Equation |
| Spectral information | R | DNr | B460 | Hyperspectral reflectance at 460 nm |
| | G | DNg | B560 | Hyperspectral reflectance at 560 nm |
| | B | DNb | B670 | Hyperspectral reflectance at 670 nm |
| | r | R/(R + G + B) | B800 | Hyperspectral reflectance at 800 nm |
| | g | G/(R + G + B) | BGI | B460/B560 |
| | b | B/(R + G + B) | NDVI | (B800 − B670)/(B800 + B670) |
| | B/R | B/R | LCI | (B850 − B710)/(B850 + B670) |
| | B/G | B/G | NPCI | (B670 − B460)/(B670 + B460) |
| | R/G | R/G | EVI2 | 2.5 × (B800 − B670)/(B800 + 2.4 × B670 + 1) |
| | EXR | 1.4 × r − g | OSAVI | 1.16 × (B800 − B670)/(B800 + B670 + 0.16) |
| | VARI | (g − r)/(g + r − b) | SPVI | 0.4 × (3.7 × (B800 − B670) − 1.2 × abs(B530 − B670)) |
| | GRVI | (g − r)/(g + r) | MCARI | ((B700 − B670) − 0.2 × (B700 − B560))/(B700/B670) |
| DSM information | Crop Height | Photogrammetry method | Crop Height | Photogrammetry method |
Note: B460, B530, B560, B670, B700, B710, B800, and B850 denote the winter wheat canopy hyperspectral reflectance at the corresponding wavelengths (nm). VIs: vegetation indices; DSM: digital surface model; BGI: blue/green index; NDVI: normalized difference vegetation index; LCI: linear combination index; NPCI: normalized pigment chlorophyll ratio index; EVI2: two-band enhanced vegetation index; OSAVI: optimized soil-adjusted vegetation index; SPVI: spectral polygon vegetation index; MCARI: modified chlorophyll absorption in reflectance index; EXR: excess red; VARI: visible atmospherically resistant index; GRVI: green-red vegetation index.
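The index definitions in Table 2 translate directly into array arithmetic. The following sketch, written with NumPy and hypothetical band-variable names, mirrors those formulas; it is not the authors' processing code, and band extraction from the UHD 185 cube or the RGB orthomosaic is assumed to have been done beforehand.

```python
import numpy as np

def hyperspectral_vis(b460, b530, b560, b670, b700, b710, b800, b850):
    """Hyperspectral VIs from Table 2 (inputs are canopy reflectance values or arrays)."""
    return {
        "BGI":   b460 / b560,
        "NDVI":  (b800 - b670) / (b800 + b670),
        "LCI":   (b850 - b710) / (b850 + b670),
        "NPCI":  (b670 - b460) / (b670 + b460),
        "EVI2":  2.5 * (b800 - b670) / (b800 + 2.4 * b670 + 1),
        "OSAVI": 1.16 * (b800 - b670) / (b800 + b670 + 0.16),
        "SPVI":  0.4 * (3.7 * (b800 - b670) - 1.2 * np.abs(b530 - b670)),
        "MCARI": ((b700 - b670) - 0.2 * (b700 - b560)) / (b700 / b670),
    }

def rgb_vis(R, G, B):
    """RGB-based indices from Table 2 (inputs are digital numbers)."""
    total = R + G + B
    r, g, b = R / total, G / total, B / total
    return {
        "r": r, "g": g, "b": b,
        "EXR":  1.4 * r - g,
        "VARI": (g - r) / (g + r - b),
        "GRVI": (g - r) / (g + r),
    }

# Example with single reflectance values / DNs for one plot (numbers are illustrative)
print(round(hyperspectral_vis(0.03, 0.06, 0.08, 0.04, 0.12, 0.22, 0.45, 0.46)["NDVI"], 3))
print(round(rgb_vis(120.0, 140.0, 90.0)["GRVI"], 3))
```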
Table 3. Descriptive statistics of above-ground biomass (AGB, t/ha), leaf area index (LAI, m2/m2), and crop height (cm) from the study area.
| Dataset | Period | Crop Variables | Samples | Min | Mean | Max | Standard Deviation | Coefficient of Variation (%) |
| Calibration | Jointing | AGB | 32 | 1.45 | 2.76 | 4.52 | 0.67 | 24.28 |
| | | LAI | 32 | 2.27 | 3.94 | 5.87 | 0.88 | 22.34 |
| | | Height | 32 | 27.33 | 33.55 | 41.00 | 3.16 | 9.42 |
| | Flagging | AGB | 32 | 2.19 | 5.48 | 8.26 | 2.11 | 38.50 |
| | | LAI | 32 | 1.30 | 4.11 | 8.81 | 1.47 | 35.77 |
| | | Height | 32 | 53.33 | 67.19 | 76.33 | 6.44 | 9.58 |
| | Flowering | AGB | 32 | 4.39 | 8.07 | 12.73 | 1.88 | 23.30 |
| | | LAI | 32 | 1.24 | 3.24 | 5.89 | 1.13 | 34.88 |
| | | Height | 32 | 58.00 | 71.38 | 84.33 | 6.62 | 9.27 |
| Validation | Jointing | AGB | 16 | 1.20 | 2.16 | 3.05 | 0.50 | 23.15 |
| | | LAI | 16 | 1.69 | 2.98 | 4.39 | 0.68 | 22.82 |
| | | Height | 16 | 25.67 | 29.18 | 34.33 | 2.11 | 7.23 |
| | Flagging | AGB | 16 | 2.21 | 4.36 | 6.01 | 1.24 | 28.44 |
| | | LAI | 16 | 1.72 | 4.33 | 6.63 | 1.53 | 35.33 |
| | | Height | 16 | 53.33 | 62.37 | 70.66 | 5.12 | 8.21 |
| | Flowering | AGB | 16 | 3.41 | 7.21 | 10.56 | 1.98 | 27.46 |
| | | LAI | 16 | 1.34 | 3.39 | 5.53 | 1.27 | 37.46 |
| | | Height | 16 | 55.00 | 69.60 | 82.33 | 7.50 | 10.78 |
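The last column of Table 3 is the coefficient of variation, i.e., the standard deviation expressed as a percentage of the mean. A minimal pandas sketch with made-up plot values shows how such summary rows can be reproduced; the column names are illustrative and not the study's field records.

```python
import pandas as pd

# Hypothetical per-plot measurements for one growth stage (values are invented)
df = pd.DataFrame({"AGB": [1.45, 2.10, 2.76, 3.40, 4.52],
                   "LAI": [2.27, 3.10, 3.94, 4.80, 5.87]})

stats = df.agg(["count", "min", "mean", "max", "std"]).T
stats["CV (%)"] = 100 * stats["std"] / stats["mean"]   # coefficient of variation as in Table 3
print(stats.round(2))
```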
Table 4. Relationship between crop parameters and spectra acquired during different growing stages.
| Period | Crop Parameters | Data | Regression Equation | R2 | MAE | RMSE |
| Jointing | AGB | G-MCARI | y = −297.22x + 6.31 | 0.66 | 0.30 | 0.39 |
| | | DC-G | y = 16.394 × e^(−0.02x) | 0.64 | 0.32 | 0.42 |
| | | UHD-LCI | y = 0.6327 × e^(2.2988x) | 0.50 | 0.38 | 0.49 |
| | LAI | G-MCARI | y = −374.23x + 8.406 | 0.62 | 0.44 | 0.54 |
| | | DC-G | y = 19.065 × e^(−0.018x) | 0.63 | 0.47 | 0.60 |
| | | UHD-LCI | y = 1.1283 × e^(1.9504x) | 0.45 | 0.51 | 0.68 |
| Flagging | AGB | G-LCI | y = 0.8174 × e^(2.6029x) | 0.67 | 0.75 | 0.84 |
| | | DC-b | y = 713.21 × e^(−14.92x) | 0.57 | 0.86 | 0.97 |
| | | UHD-LCI | y = 0.641 × e^(3.2226x) | 0.58 | 2.61 | 2.85 |
| | LAI | G-LCI | y = 0.2101 × e^(4.0378x) | 0.78 | 0.88 | 0.55 |
| | | DC-b | y = 13327 × e^(−24.82x) | 0.76 | 0.88 | 0.47 |
| | | UHD-NPCI | y = 2.8019 × e^(4.827x) | 0.74 | 0.92 | 0.51 |
| Flowering | AGB | G-SPVI | y = 2.7804 × e^(2.3615x) | 0.67 | 0.87 | 1.12 |
| | | DC-b | y = 2762.4 × e^(−17.77x) | 0.74 | 0.86 | 1.19 |
| | | UHD-BGI | y = 0.1356 × e^(6.7115x) | 0.67 | 0.93 | 1.18 |
| | LAI | G-SPVI | y = 0.5733 × e^(3.7940x) | 0.72 | 0.55 | 0.66 |
| | | DC-b | y = 39963 × e^(−28.74x) | 0.81 | 0.47 | 0.65 |
| | | UHD-MCARI | y = 39.998 × e^(−210.9x) | 0.72 | 0.51 | 0.67 |
| Total for three stages | AGB | G-LCI | y = 0.6274 × e^(2.9275x) | 0.26 | 1.88 | 2.25 |
| | | DC-r | y = 0.0033 × e^(26.549x) | 0.67 | 1.19 | 1.71 |
| | | UHD-LCI | y = 0.4263 × e^(3.7273x) | 0.30 | 1.84 | 2.18 |
| | LAI | G-LCI | y = 0.4182 × e^(3.0703x) | 0.59 | 0.67 | 0.88 |
| | | DC-B | y = 15.549 × e^(−0.018x) | 0.55 | 0.70 | 0.90 |
| | | UHD-LCI | y = 0.4307 × e^(3.2404x) | 0.46 | 0.80 | 1.01 |
Note: AGB: above-ground biomass; LAI: leaf area index; MAE: mean absolute error; RMSE: root mean square error; G- indicates data measured by the ground-based ASD spectrometer; UHD- indicates data measured using the snapshot hyperspectral sensor mounted on the UAV; and DC- indicates data measured using the digital camera mounted on the UAV.
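The single-VI models in Table 4 are simple linear or exponential fits of the form y = a × e^(bx). The sketch below illustrates the exponential case by fitting ln(y) with ordinary least squares and reporting R2, MAE, and RMSE on the original scale; the data are synthetic, and the paper's exact fitting routine (e.g., nonlinear least squares) may differ.

```python
import numpy as np

def fit_exponential(x, y):
    """Fit y = a * exp(b * x) via ordinary least squares on ln(y)."""
    b, ln_a = np.polyfit(x, np.log(y), 1)       # slope and intercept of ln(y) = b*x + ln(a)
    a = np.exp(ln_a)
    y_hat = a * np.exp(b * x)
    ss_res = np.sum((y - y_hat) ** 2)
    ss_tot = np.sum((y - np.mean(y)) ** 2)
    return {"a": round(a, 4), "b": round(b, 4),
            "R2": 1 - ss_res / ss_tot,
            "MAE": np.mean(np.abs(y - y_hat)),
            "RMSE": np.sqrt(np.mean((y - y_hat) ** 2))}

# Synthetic example: LCI vs. AGB (t/ha); the numbers are illustrative only.
lci = np.array([0.45, 0.55, 0.60, 0.68, 0.75, 0.82])
agb = np.array([2.1, 3.0, 3.6, 4.8, 5.9, 7.4])
print(fit_exponential(lci, agb))
```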
Table 5. Relationship between crop parameters and crop height in different growing stages.
| Period | Crop Parameters | Height | Regression Equation | R2 | MAE | RMSE |
| Jointing | AGB | G-height | y = 0.0296x + 1.7752 | 0.02 | 0.54 | 0.67 |
| | | DC-height | y = 0.0184x + 2.5238 | 0.03 | 0.56 | 0.67 |
| | | UHD-height | y = 0.0399x + 1.7712 | 0.17 | 0.51 | 0.62 |
| | LAI | G-height | y = 0.0138x + 3.482 | 0.01 | 0.68 | 0.88 |
| | | DC-height | y = 0.0391x + 3.3631 | 0.08 | 0.68 | 0.89 |
| | | UHD-height | y = 0.0437x + 2.8524 | 0.12 | 0.63 | 0.83 |
| Flagging | AGB | G-height | y = 0.126x − 2.9822 | 0.24 | 0.96 | 1.25 |
| | | DC-height | y = 0.0967x + 1.3464 | 0.30 | 1.12 | 1.36 |
| | | UHD-height | y = 0.1082x + 1.3362 | 0.29 | 0.96 | 1.20 |
| | LAI | G-height | y = 0.1708x − 7.361 | 0.33 | 0.98 | 1.34 |
| | | DC-height | y = 0.0968x − 0.2649 | 0.40 | 1.00 | 1.34 |
| | | UHD-height | y = 0.1491x − 1.6028 | 0.42 | 0.96 | 1.25 |
| Flowering | AGB | G-height | y = 0.1464x − 2.3713 | 0.27 | 1.30 | 1.61 |
| | | DC-height | y = 0.1522x − 1.3578 | 0.36 | 1.26 | 1.61 |
| | | UHD-height | y = 0.1602x − 0.0044 | 0.38 | 1.21 | 1.53 |
| | LAI | G-height | y = 0.081x − 2.5379 | 0.22 | 0.78 | 1.00 |
| | | DC-height | y = 0.0872x − 2.1487 | 0.33 | 0.76 | 0.97 |
| | | UHD-height | y = 0.0905x − 1.3123 | 0.33 | 0.74 | 0.94 |
| Total for three stages | AGB | G-height | y = 1.1613 × e^(0.0248x) | 0.73 | 1.08 | 1.49 |
| | | DC-height | y = 0.1074x + 1.2048 | 0.73 | 1.00 | 1.29 |
| | | UHD-height | y = 0.1736x − 1.1141 | 0.71 | 1.04 | 1.39 |
| | LAI | G-height | y = 0.0009x + 3.7193 | 0.01 | 1.01 | 1.31 |
| | | DC-height | y = 0.0003x + 3.783 | 0.01 | 1.01 | 1.31 |
| | | UHD-height | y = 0.0133x + 3.2676 | 0.02 | 1.01 | 1.30 |
Note: AGB: above-ground biomass; LAI: leaf area index; MAE: mean absolute error; RMSE: root mean square error; G- indicates data measured by a steel tape ruler; UHD- indicates data measured using the snapshot hyperspectral sensor mounted on the UAV; and DC- indicates data measured by using the digital camera mounted on the UAV.
Table 6. Relationship between crop parameters and spectral VIs and crop height in different growing stages.
| Period | Crop Parameters | Information | Regression Equation | R2 | MAE | RMSE |
| Jointing | AGB | G-height, MCARI | y = 0.0009 × x1/x2 + 0.2503 | 0.56 | 0.38 | 0.46 |
| | | DC-height, G | y = 0.642 × x1/x2 + 2.6632 | 0.11 | 0.55 | 0.67 |
| | | UHD-height, LCI | y = 0.0534 × x1 × x2 + 1.9108 | 0.19 | 0.50 | 0.61 |
| | LAI | G-height, MCARI | y = 0.001 × x1/x2 + 0.949 | 0.47 | 0.54 | 0.65 |
| | | DC-height, G | y = 0.9607 × x1/x2 + 3.7859 | 0.01 | 0.67 | 0.88 |
| | | UHD-height, LCI | y = 2.9786 × e^(0.0159 × x1 × x2) | 0.16 | 0.61 | 0.83 |
| Flagging | AGB | G-height, LCI | y = 1.4661 × e^(0.0264 × x1 × x2) | 0.61 | 0.72 | 0.91 |
| | | DC-height, b | y = 2.5554 × e^(0.0055 × x1/x2) | 0.30 | 1.09 | 1.27 |
| | | UHD-height, LCI | y = 2.5202 × e^(0.0286 × x1 × x2) | 0.38 | 0.98 | 1.20 |
| | LAI | G-height, LCI | y = 0.4651 × e^(0.0433 × x1 × x2) | 0.79 | 0.60 | 0.86 |
| | | DC-height, b | y = 1.0879 × e^(0.0094 × x1/x2) | 0.43 | 1.07 | 1.41 |
| | | UHD-height, NPCI | y = 3.0398 × e^(160.5 × x1/x2) | 0.68 | 0.80 | 1.06 |
| Flowering | AGB | G-height, SPVI | y = 3.8639 × e^(0.0223 × x1 × x2) | 0.59 | 1.00 | 1.26 |
| | | DC-height, b | y = 0.0366 × x1/x2 + 1.2362 | 0.37 | 1.20 | 1.49 |
| | | UHD-height, BGI | y = 4.42 × e^(0.0292 × x1 × x2) | 0.49 | 1.12 | 1.42 |
| | LAI | G-height, SPVI | y = 0.9984 × e^(0.035 × x1 × x2) | 0.60 | 0.65 | 0.79 |
| | | DC-height, b | y = 0.7342 × e^(0.0076 × x1/x2) | 0.42 | 0.71 | 0.90 |
| | | UHD-height, MCARI | y = 0.7491 × e^(0.0003 × x1/x2) | 0.71 | 0.64 | 0.84 |
| Total for three stages | AGB | G-height, LCI | y = 1.3628 × e^(0.031 × x1 × x2) | 0.81 | 0.99 | 1.32 |
| | | DC-height, r | y = 2.1193 × e^(0.0728 × x1 × x2) | 0.77 | 1.02 | 1.30 |
| | | UHD-height, LCI | y = 0.2265 × x1 × x2 − 0.3245 | 0.74 | 1.05 | 1.38 |
| | LAI | G-height, LCI | y = 0.0239 × x1 × x2 + 2.8036 | 0.07 | 1.00 | 1.27 |
| | | DC-height, B | y = 0.9735 × x1/x2 + 3.2818 | 0.04 | 1.01 | 1.29 |
| | | UHD-height, LCI | y = 0.033 × x1 × x2 + 2.9357 | 0.06 | 1.00 | 1.27 |
Note: AGB: above-ground biomass; LAI: leaf area index; MAE: mean absolute error; RMSE: root mean square error; G- indicates data measured on the ground (ASD spectrometer for spectra, steel tape ruler for crop height); UHD- indicates data measured using the snapshot hyperspectral sensor mounted on the UAV; and DC- indicates data measured using the digital camera mounted on the UAV.
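Table 6 combines the two data sources by using crop height multiplied (or divided) by a VI as the single predictor before fitting the same linear or exponential forms. The short sketch below builds a height × LCI product and fits the exponential model; all values and variable names are synthetic and illustrative, not taken from the study's dataset.

```python
import numpy as np

# Combined predictor: crop height (cm) multiplied by a VI, then an exponential fit,
# mirroring the "height x VI" terms in Table 6. Values below are synthetic.
height = np.array([30.0, 34.0, 62.0, 67.0, 70.0, 72.0])   # crop height, cm
lci    = np.array([0.40, 0.48, 0.62, 0.68, 0.72, 0.78])   # LCI
agb    = np.array([2.2,  2.9,  4.8,  5.6,  7.1,  8.2])    # AGB, t/ha

x = height * lci                                           # combined x1 * x2 feature
b, ln_a = np.polyfit(x, np.log(agb), 1)                    # fit ln(AGB) = b*x + ln(a)
agb_hat = np.exp(ln_a) * np.exp(b * x)
rmse = np.sqrt(np.mean((agb - agb_hat) ** 2))
print(f"AGB = {np.exp(ln_a):.3f} * exp({b:.4f} * height * LCI), RMSE = {rmse:.2f} t/ha")
```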
Table 7. Leaf area index and above-ground biomass estimates based on all vegetation indices from the calibration dataset, using partial least squares regression and random forest methods (calibration dataset, mean AGB = 5.44 t/ha, mean LAI = 3.77 m2/m2).
| Methods | Data | AGB R2 | AGB MAE | AGB nRMSE (%) | AGB RMSE | LAI R2 | LAI MAE | LAI nRMSE (%) | LAI RMSE |
| RF | G-VIs | 0.93 | 0.58 | 14.14 | 0.77 | 0.94 | 0.26 | 8.75 | 0.33 |
| | DC-VIs | 0.93 | 0.52 | 13.22 | 0.72 | 0.94 | 0.28 | 9.55 | 0.36 |
| | UHD-VIs | 0.93 | 0.65 | 14.70 | 0.80 | 0.93 | 0.32 | 10.61 | 0.40 |
| PLSR | G-VIs | 0.35 | 1.75 | 38.39 | 2.09 | 0.58 | 0.64 | 22.81 | 0.86 |
| | DC-VIs | 0.61 | 1.19 | 29.76 | 1.62 | 0.50 | 0.73 | 24.67 | 0.93 |
| | UHD-VIs | 0.34 | 1.77 | 38.58 | 2.10 | 0.45 | 0.80 | 25.73 | 0.97 |
Note: AGB: above-ground biomass; LAI: leaf area index; MAE: mean absolute error; RMSE: root mean square error; nRMSE: normalized root mean square error; G- indicates data measured by the ground-based ASD spectrometer; UHD- indicates data measured using the snapshot hyperspectral sensor mounted on the UAV; and DC- indicates data measured using the digital camera mounted on the UAV.
Table 8. Estimates of leaf area index and above-ground biomass based on crop height and all vegetation indices, using partial least squares regression and random forest methods (calibration dataset, mean AGB = 5.44 t/ha, mean LAI = 3.77 m2/m2).
| Methods | Data | AGB R2 | AGB MAE | AGB nRMSE (%) | AGB RMSE | LAI R2 | LAI MAE | LAI nRMSE (%) | LAI RMSE |
| RF | G-height, VIs | 0.96 | 0.40 | 10.47 | 0.57 | 0.95 | 0.24 | 8.22 | 0.31 |
| | DC-height, VIs | 0.96 | 0.42 | 10.10 | 0.55 | 0.94 | 0.27 | 9.54 | 0.36 |
| | UHD-height, VIs | 0.94 | 0.55 | 13.04 | 0.71 | 0.94 | 0.31 | 10.34 | 0.39 |
| PLSR | G-height, VIs | 0.81 | 0.83 | 20.58 | 1.12 | 0.65 | 0.56 | 20.42 | 0.77 |
| | DC-height, VIs | 0.77 | 0.95 | 22.97 | 1.25 | 0.52 | 0.72 | 24.13 | 0.91 |
| | UHD-height, VIs | 0.64 | 1.26 | 28.48 | 1.55 | 0.47 | 0.79 | 25.46 | 0.96 |
Note: AGB: above-ground biomass; LAI: leaf area index; MAE: mean absolute error; RMSE: root mean square error; nRMSE: normalized root mean square error; G- indicates data measured on the ground (ASD spectrometer for spectra, steel tape ruler for crop height); UHD- indicates data measured using the snapshot hyperspectral sensor mounted on the UAV; and DC- indicates data measured using the digital camera mounted on the UAV.
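Tables 7 and 8 report multi-variable models fitted with random forest (RF) and partial least squares regression (PLSR), and nRMSE is the RMSE divided by the mean observed value, expressed as a percentage. The scikit-learn sketch below illustrates that general calibration/validation workflow with random stand-in data; the sample sizes echo Table 3 (96 calibration and 48 validation plots), but the feature matrices, targets, and hyperparameters are illustrative and not the authors' settings.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error, mean_squared_error, r2_score

def evaluate(model, X_cal, y_cal, X_val, y_val):
    """Fit on the calibration set and report R2, MAE, RMSE, and nRMSE (%) on the validation set."""
    model.fit(X_cal, y_cal)
    y_hat = np.ravel(model.predict(X_val))
    rmse = np.sqrt(mean_squared_error(y_val, y_hat))
    return {"R2": r2_score(y_val, y_hat),
            "MAE": mean_absolute_error(y_val, y_hat),
            "RMSE": rmse,
            "nRMSE (%)": 100 * rmse / np.mean(y_val)}

# Synthetic stand-ins: rows = plots, columns = VIs (optionally with crop height appended).
rng = np.random.default_rng(0)
X_cal, X_val = rng.random((96, 13)), rng.random((48, 13))
y_cal = 2 + 6 * X_cal[:, 0] + rng.normal(0, 0.3, 96)       # pseudo-AGB, t/ha
y_val = 2 + 6 * X_val[:, 0] + rng.normal(0, 0.3, 48)

print("RF:  ", evaluate(RandomForestRegressor(n_estimators=500, random_state=0),
                        X_cal, y_cal, X_val, y_val))
print("PLSR:", evaluate(PLSRegression(n_components=5), X_cal, y_cal, X_val, y_val))
```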
