
UAVs for Vegetation Monitoring

A special issue of Remote Sensing (ISSN 2072-4292). This special issue belongs to the section "Remote Sensing in Agriculture and Vegetation".

Deadline for manuscript submissions: closed (31 March 2021) | Viewed by 140135

Printed Edition Available!
A printed edition of this Special Issue is available here.

Special Issue Editors


Guest Editor
IMAPING Group—Remote Sensing for Precision Agriculture, Crop Protection, Institute for Sustainable Agriculture – CSIC (Spanish National Research Council), Campus Alameda del Obispo, 14004 Córdoba, Spain
Interests: remote sensing; crop monitoring; precision agriculture; weeds; plant disease; crop phenotyping; site-specific weed management; object-based image analysis (OBIA)

Guest Editor
Biological Systems Engineering, University of Nebraska-Lincoln, 3605 Fair Street, Lincoln, NE 68583, USA
Interests: remote sensing; UAVs; data analysis; machine learning; abiotic and biotic stress monitoring and management

Guest Editor
Institute of Agricultural Sciences, CSIC, Plant Protection Department, 28006 Madrid, Spain
Interests: precision agriculture; UAV and satellite remote sensing; object-based image analysis (OBIA); digitization and sensors in agriculture; crop protection; weed mapping; sustainable agriculture

Guest Editor
Center of Applied Artificial Intelligence for Sustainable Agriculture (CAAI4SA), South Carolina State University, Orangeburg, SC 29117, USA
Interests: sensors; automation; emerging technologies; small unmanned aircraft systems (sUAS) and robotics

Special Issue Information

Dear Colleagues,

Global crop production faces a major sustainability challenge in the context of a rapidly growing world population and gradually diminishing natural resources. Remote sensing plays a fundamental role in changing the plant production model through the development of new technologies (robots, UAVs, sensors), making production more profitable, competitive, and sustainable. Among these advances, unmanned aerial vehicles (UAVs) equipped with perception systems have demonstrated their suitability for the timely assessment and monitoring of vegetation. They can be operated at low altitudes to provide ultra-high-spatial-resolution imagery, offer great flexibility in flight scheduling so that data can be collected at critical moments, and enable the generation of digital surface models (DSMs) from highly overlapping images using photo-reconstruction or computer vision techniques. It is therefore essential to advance research on the technical configuration of UAVs, as well as on the processing and analysis of UAV imagery of agricultural and forest scenarios, in order to strengthen our knowledge of these ecosystems and thereby improve farmers' decision-making.

We encourage all researchers involved in the development and application of UAVs to present their most recent findings on promising developments in vegetation monitoring. This Special Issue welcomes original and innovative papers demonstrating the use of UAVs for remote sensing applications in agriculture, forestry, and natural resources management. The selection of papers for publication will depend on the quality and rigor of the research and of the paper itself. Specific topics include, but are not limited to:

  • UAV configuration and specifications for forest or agricultural applications;
  • Object- or pixel-based image analysis approaches for vegetation monitoring;
  • Artificial intelligence-based image-processing approaches;
  • Integration of UAV imagery with ground-based datasets or other remote and proximal measurements;
  • Biotic (weeds, disease) and abiotic (water, nutrition deficiencies) stress factors—sensing and modeling;
  • Crop yield estimation or prediction;
  • High-throughput phenotyping;
  • UAV-based prescription map development for site-specific management;
  • Precision agriculture applications;
  • UAV image pre-processing for radiometric, spectral and spatial calibration, and mosaicking;
  • Development, integration, and testing of new and emerging sensors and technologies for UAV-based crop management.

Dr. Ana de Castro Megías
Dr. Yeyin Shi
Dr. José M. Peña
Prof. Dr. Joe Maja
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles as well as short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Remote Sensing is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2700 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • UAV
  • crop mapping
  • image analysis
  • multispectral
  • hyperspectral
  • thermal
  • reflectance
  • classification
  • digital surface model
  • precision agriculture

Published Papers (22 papers)


Research


22 pages, 4106 KiB  
Article
Improved Estimation of Winter Wheat Aboveground Biomass Using Multiscale Textures Extracted from UAV-Based Digital Images and Hyperspectral Feature Analysis
by Yuanyuan Fu, Guijun Yang, Xiaoyu Song, Zhenhong Li, Xingang Xu, Haikuan Feng and Chunjiang Zhao
Remote Sens. 2021, 13(4), 581; https://0-doi-org.brum.beds.ac.uk/10.3390/rs13040581 - 06 Feb 2021
Cited by 58 | Viewed by 4747
Abstract
Rapid and accurate crop aboveground biomass estimation is beneficial for high-throughput phenotyping and site-specific field management. This study explored the utility of high-definition digital images acquired by a low-flying unmanned aerial vehicle (UAV) and ground-based hyperspectral data for improved estimates of winter wheat biomass. To extract fine textures for characterizing the variations in winter wheat canopy structure during growing seasons, we proposed a multiscale texture extraction method (Multiscale_Gabor_GLCM) that took advantage of multiscale Gabor transformation and gray-level co-occurrence matrix (GLCM) analysis. Narrowband normalized difference vegetation indices (NDVIs) involving all possible two-band combinations and continuum removal of red-edge spectra (SpeCR) were also extracted for biomass estimation. Subsequently, non-parametric linear (i.e., partial least squares regression, PLSR) and nonlinear regression (i.e., least squares support vector machine, LSSVM) analyses were conducted using the extracted spectral features, multiscale textural features and combinations thereof. The visualization technique of LSSVM was utilized to select the multiscale textures that contributed most to the biomass estimation for the first time. Compared with the best-performing NDVI (1193, 1222 nm), the SpeCR yielded a higher coefficient of determination (R2), lower root mean square error (RMSE), and lower mean absolute error (MAE) for winter wheat biomass estimation and significantly alleviated the saturation problem after biomass exceeded 800 g/m2. The predictive performance of the PLSR and LSSVM regression models based on SpeCR decreased with increasing bandwidths, especially at bandwidths larger than 11 nm. Both the PLSR and LSSVM regression models based on the multiscale textures produced higher accuracies than those based on the single-scale GLCM-based textures. According to the evaluation of variable importance, the texture metrics "Mean" from different scales were determined as the most influential to winter wheat biomass. Using just 10 multiscale textures largely improved predictive performance over using all textures and achieved an accuracy comparable with using SpeCR. The LSSVM regression model based on the combination of the selected multiscale textures and SpeCR with a bandwidth of 9 nm produced the highest estimation accuracy with R2val = 0.87, RMSEval = 119.76 g/m2, and MAEval = 91.61 g/m2. However, the combination did not significantly improve the estimation accuracy compared to the use of SpeCR or multiscale textures only. The accuracy of the biomass predicted by the LSSVM regression models was higher than the results of the PLSR models, which demonstrated that LSSVM is a potential candidate to characterize winter wheat biomass during multiple growth stages. The study suggests that multiscale textures derived from high-definition UAV-based digital images are competitive with hyperspectral features in predicting winter wheat biomass. Full article
(This article belongs to the Special Issue UAVs for Vegetation Monitoring)
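
The abstract above singles out the GLCM "Mean" texture as the most influential predictor of biomass. As a rough illustration of what that statistic measures, here is a minimal numpy sketch of a single-offset gray-level co-occurrence matrix and its "Mean"; it does not reproduce the paper's Multiscale_Gabor_GLCM pipeline (multiscale Gabor filtering, multiple offsets), and the toy patch is illustrative only.

```python
import numpy as np

def glcm(img, dx=1, dy=0, levels=8):
    """Gray-level co-occurrence matrix for one pixel offset, normalized to probabilities."""
    h, w = img.shape
    P = np.zeros((levels, levels))
    for y in range(h - dy):
        for x in range(w - dx):
            P[img[y, x], img[y + dy, x + dx]] += 1
    return P / P.sum()

def glcm_mean(P):
    """The GLCM 'Mean' statistic: sum over i, j of i * P(i, j)."""
    i = np.arange(P.shape[0])[:, None]
    return float((i * P).sum())

# Toy 8-level patch standing in for a crop of a UAV image
rng = np.random.default_rng(0)
patch = rng.integers(0, 8, size=(32, 32))
P = glcm(patch)
print(round(glcm_mean(P), 3))
```

In the study this statistic is computed per Gabor scale and fed, with other textures, into the PLSR/LSSVM regressions.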

24 pages, 7428 KiB  
Article
Applying RGB- and Thermal-Based Vegetation Indices from UAVs for High-Throughput Field Phenotyping of Drought Tolerance in Forage Grasses
by Tom De Swaef, Wouter H. Maes, Jonas Aper, Joost Baert, Mathias Cougnon, Dirk Reheul, Kathy Steppe, Isabel Roldán-Ruiz and Peter Lootens
Remote Sens. 2021, 13(1), 147; https://0-doi-org.brum.beds.ac.uk/10.3390/rs13010147 - 05 Jan 2021
Cited by 35 | Viewed by 7134
Abstract
The persistence and productivity of forage grasses, important sources for feed production, are threatened by climate change-induced drought. Breeding programs are in search of new drought tolerant forage grass varieties, but those programs still rely on time-consuming and less consistent visual scoring by breeders. In this study, we evaluate whether Unmanned Aerial Vehicle (UAV) based remote sensing can complement or replace this visual breeder score. A field experiment was set up to test the drought tolerance of genotypes from three common forage types of two different species: Festuca arundinacea, diploid Lolium perenne and tetraploid Lolium perenne. Drought stress was imposed by using mobile rainout shelters. UAV flights with RGB and thermal sensors were conducted at five time points during the experiment. Visual-based indices from different colour spaces were selected that were closely correlated to the breeder score. Furthermore, several indices, in particular H and NDLab, from the HSV (Hue Saturation Value) and CIELab (Commission Internationale de l’éclairage) colour space, respectively, displayed a broad-sense heritability that was as high or higher than the visual breeder score, making these indices highly suited for high-throughput field phenotyping applications that can complement or even replace the breeder score. The thermal-based Crop Water Stress Index CWSI provided complementary information to visual-based indices, enabling the analysis of differences in ecophysiological mechanisms for coping with reduced water availability between species and ploidy levels. All species/types displayed variation in drought stress tolerance, which confirms that there is sufficient variation for selection within these groups of grasses. Our results confirmed the better drought tolerance potential of Festuca arundinacea, but also showed which Lolium perenne genotypes are more tolerant. Full article
(This article belongs to the Special Issue UAVs for Vegetation Monitoring)
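
For readers unfamiliar with the thermal index used above: in its classical empirical form, the Crop Water Stress Index normalizes canopy temperature between a non-stressed (wet) and a fully stressed (dry) reference temperature. A minimal sketch follows; the paper's exact formulation and choice of reference temperatures may differ.

```python
def cwsi(t_canopy, t_wet, t_dry):
    """Crop Water Stress Index: 0 = fully transpiring (cool), 1 = fully stressed (hot)."""
    if t_dry <= t_wet:
        raise ValueError("t_dry must exceed t_wet")
    return (t_canopy - t_wet) / (t_dry - t_wet)

# Canopy at 26 °C between a 22 °C wet and a 32 °C dry reference
print(cwsi(26.0, 22.0, 32.0))  # 0.4
```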

22 pages, 10901 KiB  
Article
Wheat Yellow Rust Detection Using UAV-Based Hyperspectral Technology
by Anting Guo, Wenjiang Huang, Yingying Dong, Huichun Ye, Huiqin Ma, Bo Liu, Wenbin Wu, Yu Ren, Chao Ruan and Yun Geng
Remote Sens. 2021, 13(1), 123; https://0-doi-org.brum.beds.ac.uk/10.3390/rs13010123 - 01 Jan 2021
Cited by 96 | Viewed by 10469
Abstract
Yellow rust is a worldwide disease that poses a serious threat to the safety of wheat production. Numerous studies on near-surface hyperspectral remote sensing at the leaf scale have achieved good results for disease monitoring. The next step is to monitor the disease at the field scale, which is of great significance for disease control. In our study, an unmanned aerial vehicle (UAV) equipped with a hyperspectral sensor was used to obtain hyperspectral images at the field scale. Vegetation indices (VIs) and texture features (TFs) extracted from the UAV-based hyperspectral images and their combination were used to establish partial least-squares regression (PLSR)-based disease monitoring models in different infection periods. In addition, we resampled the original images with 1.2 cm spatial resolution to images with different spatial resolutions (3 cm, 5 cm, 7 cm, 10 cm, 15 cm, and 20 cm) to evaluate the effect of spatial resolution on disease monitoring accuracy. The findings showed that the VI-based model had the highest monitoring accuracy (R2 = 0.75) in the mid-infection period. The TF-based model could be used to monitor yellow rust at the field scale and obtained the highest R2 in the mid- and late-infection periods (0.65 and 0.82, respectively). The VI-TF-based models had the highest accuracy in each infection period and outperformed the VI-based or TF-based models. The spatial resolution had a negligible influence on the VI-based monitoring accuracy, but significantly influenced the TF-based monitoring accuracy. Furthermore, the optimal spatial resolution for monitoring yellow rust using the VI-TF-based model in each infection period was 10 cm. The findings provide a reference for accurate disease monitoring using UAV hyperspectral images. Full article
(This article belongs to the Special Issue UAVs for Vegetation Monitoring)
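
The resolution experiment above resamples 1.2 cm imagery to coarser grids (3 to 20 cm). One simple way to simulate a coarser ground sampling distance is block mean-pooling of each band; the numpy sketch below illustrates the idea (the abstract does not state which resampling method the authors used).

```python
import numpy as np

def mean_pool(band, factor):
    """Downsample a 2-D band by averaging non-overlapping factor x factor blocks."""
    h, w = band.shape
    h2, w2 = h - h % factor, w - w % factor  # crop to a multiple of factor
    blocks = band[:h2, :w2].reshape(h2 // factor, factor, w2 // factor, factor)
    return blocks.mean(axis=(1, 3))

band = np.arange(16.0).reshape(4, 4)
print(mean_pool(band, 2))
```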

20 pages, 4442 KiB  
Article
Predicting Tree Sap Flux and Stomatal Conductance from Drone-Recorded Surface Temperatures in a Mixed Agroforestry System—A Machine Learning Approach
by Florian Ellsäßer, Alexander Röll, Joyson Ahongshangbam, Pierre-André Waite, Hendrayanto, Bernhard Schuldt and Dirk Hölscher
Remote Sens. 2020, 12(24), 4070; https://0-doi-org.brum.beds.ac.uk/10.3390/rs12244070 - 12 Dec 2020
Cited by 16 | Viewed by 3914
Abstract
Plant transpiration is a key element in the hydrological cycle. Widely used methods for its assessment comprise sap flux techniques for whole-plant transpiration and porometry for leaf stomatal conductance. Recently emerging approaches based on surface temperatures and a wide range of machine learning techniques offer new possibilities to quantify transpiration. The focus of this study was to predict sap flux and leaf stomatal conductance based on drone-recorded and meteorological data and compare these predictions with in-situ measured transpiration. To build the prediction models, we applied classical statistical approaches and machine learning algorithms. The field work was conducted in an oil palm agroforest in lowland Sumatra. Random forest predictions yielded the highest congruence with measured sap flux (r2 = 0.87 for trees and r2 = 0.58 for palms) and confidence intervals for intercept and slope of a Passing-Bablok regression suggest interchangeability of the methods. Differences in model performance are indicated when predicting different tree species. Predictions for stomatal conductance were less congruent for all prediction methods, likely due to spatial and temporal offsets of the measurements. Overall, the applied drone and modelling scheme predicts whole-plant transpiration with high accuracy. We conclude that there is large potential in machine learning approaches for ecological applications such as predicting transpiration. Full article
(This article belongs to the Special Issue UAVs for Vegetation Monitoring)

31 pages, 37608 KiB  
Article
Estimation of Nitrogen in Rice Crops from UAV-Captured Images
by Julian D. Colorado, Natalia Cera-Bornacelli, Juan S. Caldas, Eliel Petro, Maria C. Rebolledo, David Cuellar, Francisco Calderon, Ivan F. Mondragon and Andres Jaramillo-Botero
Remote Sens. 2020, 12(20), 3396; https://0-doi-org.brum.beds.ac.uk/10.3390/rs12203396 - 16 Oct 2020
Cited by 24 | Viewed by 5824
Abstract
Leaf nitrogen (N) directly correlates with chlorophyll production, affecting crop growth and yield. Farmers use soil plant analysis development (SPAD) devices to estimate the amount of chlorophyll present in plants. However, monitoring large-scale crops using SPAD is prohibitively time-consuming and demanding. This paper presents an unmanned aerial vehicle (UAV) solution for estimating leaf N content in rice crops from multispectral imagery. Our contribution is twofold: (i) a novel trajectory control strategy to reduce the angular wind-induced perturbations that affect image sampling accuracy during UAV flight, and (ii) machine learning models to estimate the canopy N via vegetation indices (VIs) obtained from the aerial imagery. This approach integrates an image processing algorithm using the GrabCut segmentation method with a guided filtering refinement process, to calculate the VIs according to the plots of interest. Three machine learning methods based on multivariable linear regressions (MLR), support vector machines (SVM), and neural networks (NN) were applied and compared across the entire phenological cycle of the crop: vegetative (V), reproductive (R), and ripening (Ri). Correlations were obtained by comparing our methods against an assembled ground-truth of SPAD measurements. The highest N correlations were achieved with NN: 0.98 (V), 0.94 (R), and 0.89 (Ri). We claim that the proposed UAV stabilization control algorithm significantly improves the N-to-SPAD correlations by minimizing wind perturbations in real-time and reducing the need for offline image corrections. Full article
(This article belongs to the Special Issue UAVs for Vegetation Monitoring)

17 pages, 14092 KiB  
Article
Automatic Detection of Maize Tassels from UAV Images by Combining Random Forest Classifier and VGG16
by Xuli Zan, Xinlu Zhang, Ziyao Xing, Wei Liu, Xiaodong Zhang, Wei Su, Zhe Liu, Yuanyuan Zhao and Shaoming Li
Remote Sens. 2020, 12(18), 3049; https://0-doi-org.brum.beds.ac.uk/10.3390/rs12183049 - 18 Sep 2020
Cited by 28 | Viewed by 3978
Abstract
The tassel development status and branch number during the maize flowering stage are key phenotypic traits for determining the growth process and pollen quantity of different maize varieties, and for arranging detasseling in seed maize production fields. Rapid and accurate detection of tassels is therefore of great significance for maize breeding and seed production. However, because of the complex planting environment in the field, such as unsynchronized growth stages and tassels that vary in size and shape across varieties, maize tassel detection remains a challenging problem, and existing methods cannot distinguish early tassels. In this study, based on time-series unmanned aerial vehicle (UAV) RGB images covering the maize flowering stage, we proposed an algorithm for the automatic detection of maize tassels in complex scenes using random forest (RF) and VGG16. First, the RF was used to segment UAV images into tassel and non-tassel regions, and potential tassel region proposals were then extracted by a morphological method; afterwards, false positives were removed by a VGG16 network trained with a 7:3 ratio of training to validation data. To demonstrate the performance of the proposed method, 50 plots were selected randomly from the UAV images. The precision, recall and F1-score were 0.904, 0.979 and 0.94, respectively. The 50 plots were also divided into early, middle and late tasseling stages according to the proportion of tasseling plants and the morphology of the tassels; detection performance ranked late > middle > early tasseling stage, with corresponding F1-scores of 0.962, 0.914 and 0.863, respectively. The model errors mainly came from recognizing leaf veins and reflective leaves as tassels. Finally, to show the morphological characteristics of tassels directly, we proposed an endpoint detection method based on the tassel skeleton and further extracted the tassel branch number. The proposed method can detect tassels at different development stages and supports large-scale tassel detection and branch number extraction. Full article
(This article belongs to the Special Issue UAVs for Vegetation Monitoring)

23 pages, 9525 KiB  
Article
Mask R-CNN Refitting Strategy for Plant Counting and Sizing in UAV Imagery
by Mélissande Machefer, François Lemarchand, Virginie Bonnefond, Alasdair Hitchins and Panagiotis Sidiropoulos
Remote Sens. 2020, 12(18), 3015; https://0-doi-org.brum.beds.ac.uk/10.3390/rs12183015 - 16 Sep 2020
Cited by 59 | Viewed by 7753
Abstract
This work introduces a method that combines remote sensing and deep learning into a framework tailored for accurate, reliable and efficient counting and sizing of plants in aerial images. The investigated task focuses on two low-density crops, potato and lettuce. This double objective of counting and sizing is achieved through the detection and segmentation of individual plants by fine-tuning an existing deep learning architecture called Mask R-CNN. This paper includes a thorough discussion on the optimal parametrisation to adapt the Mask R-CNN architecture to this novel task. As we examine the correlation of the Mask R-CNN performance with the annotation volume and granularity (coarse or refined) of remotely sensed images of plants, we conclude that transfer learning can be effectively used to reduce the required amount of labelled data. Indeed, a Mask R-CNN previously trained on one low-density crop can improve performance when subsequently trained on new crops. Once trained for a given crop, the Mask R-CNN solution is shown to outperform a manually-tuned computer vision algorithm. Model performances are assessed using intuitive metrics such as Mean Average Precision (mAP) from the Intersection over Union (IoU) of the masks for individual plant segmentation and Multiple Object Tracking Accuracy (MOTA) for detection. The presented model reaches an mAP of 0.418 for potato plants and 0.660 for lettuces in the individual plant segmentation task. In detection, we obtain a MOTA of 0.781 for potato plants and 0.918 for lettuces. Full article
(This article belongs to the Special Issue UAVs for Vegetation Monitoring)
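
The mask-level mAP reported above is built on the Intersection over Union between predicted and ground-truth masks. Here is a minimal sketch of that building block (the full mAP computation, which aggregates precision over IoU thresholds, is omitted; the toy masks are illustrative only).

```python
import numpy as np

def iou(pred, truth):
    """Intersection over Union of two boolean masks; 0.0 when both are empty."""
    inter = np.logical_and(pred, truth).sum()
    union = np.logical_or(pred, truth).sum()
    return float(inter) / union if union else 0.0

pred = np.array([[1, 1], [0, 0]], dtype=bool)   # predicted plant mask
truth = np.array([[1, 0], [1, 0]], dtype=bool)  # annotated plant mask
print(round(iou(pred, truth), 3))  # 0.333
```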

20 pages, 3703 KiB  
Article
Dynamic Influence Elimination and Chlorophyll Content Diagnosis of Maize Using UAV Spectral Imagery
by Lang Qiao, Dehua Gao, Junyi Zhang, Minzan Li, Hong Sun and Junyong Ma
Remote Sens. 2020, 12(16), 2650; https://0-doi-org.brum.beds.ac.uk/10.3390/rs12162650 - 17 Aug 2020
Cited by 34 | Viewed by 3751
Abstract
In order to improve the accuracy of chlorophyll content diagnosis in the maize canopy, remote sensing images of the maize canopy across multiple growth stages were acquired using an unmanned aerial vehicle (UAV) equipped with a spectral camera. Dynamic influencing factors in the canopy multispectral images were removed using different image segmentation methods, and the chlorophyll content of maize in the field was then diagnosed. Crop canopy spectral reflectance, coverage, and texture information were combined to compare the segmentation methods, and a whole-growth-period maize canopy chlorophyll content diagnostic model was built on the basis of each. Results showed that the segmentation methods differ in how they extract maize canopy parameters. The wavelet segmentation method demonstrated advantages over the threshold and ExG index segmentation methods: it segments out the soil background, reduces the texture complexity of the image, and achieves satisfactory results. Maize canopy multispectral band reflectance and vegetation indices were extracted on the basis of the different segmentation methods, and a partial least squares regression algorithm was used to construct the diagnostic models. Model accuracy was low when the image background was not removed (Rc2 (the determination coefficient of the calibration set) = 0.5431, RMSEF (the root mean squared error of forecast) = 4.2184, MAE (the mean absolute error) = 3.24; Rv2 (the determination coefficient of the validation set) = 0.5894, RMSEP (the root mean squared error of prediction) = 4.6947, and MAE = 3.36). The diagnostic accuracy of the chlorophyll content could be improved by extracting the maize canopy with the wavelet segmentation method, which produced the model with the highest accuracy (Rc2 = 0.6638, RMSEF = 3.6211, MAE = 2.89; Rv2 = 0.6923, RMSEP = 3.9067, and MAE = 3.19). This research provides a feasible method for crop growth and nutrition monitoring from a UAV platform and offers guidance for crop cultivation management. Full article
(This article belongs to the Special Issue UAVs for Vegetation Monitoring)
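
Of the three segmentation methods compared above, the ExG index method is the simplest to illustrate: threshold the excess-green index computed on chromatic coordinates to separate vegetation from soil. A minimal numpy sketch follows; the threshold value here is illustrative, not the study's.

```python
import numpy as np

def exg_mask(rgb, thresh=0.1):
    """Excess-green index (ExG = 2g - r - b on chromatic coordinates); True = vegetation."""
    s = rgb.sum(axis=-1, keepdims=True) + 1e-9  # avoid division by zero
    r, g, b = np.moveaxis(rgb / s, -1, 0)
    return (2 * g - r - b) > thresh

# One green (canopy-like) pixel and one gray (soil-like) pixel
img = np.array([[[40, 120, 30]], [[90, 85, 80]]], dtype=float)
print(exg_mask(img).ravel())  # [ True False]
```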

26 pages, 7034 KiB  
Article
Drone Image Segmentation Using Machine and Deep Learning for Mapping Raised Bog Vegetation Communities
by Saheba Bhatnagar, Laurence Gill and Bidisha Ghosh
Remote Sens. 2020, 12(16), 2602; https://0-doi-org.brum.beds.ac.uk/10.3390/rs12162602 - 12 Aug 2020
Cited by 74 | Viewed by 12859
Abstract
The application of drones has recently revolutionised the mapping of wetlands due to their high spatial resolution and flexibility in capturing images. In this study, drone imagery was used to map key vegetation communities in an Irish wetland, Clara Bog, for the spring season. The mapping, carried out through image segmentation or semantic segmentation, was performed using machine learning (ML) and deep learning (DL) algorithms. With the aim of identifying the most appropriate, cost-efficient, and accurate segmentation method, multiple ML classifiers and DL models were compared. Random forest (RF) was identified as the best pixel-based ML classifier, providing good accuracy (≈85%) when used in conjunction with a graph cut algorithm for image segmentation. Amongst the DL networks, a convolutional neural network (CNN) architecture in a transfer learning framework was utilised. A combination of ResNet50 and SegNet architectures gave the best semantic segmentation results (≈90%). The high accuracy of the DL networks came at the cost of significantly larger labelled training datasets, computation times, and hardware requirements than the ML classifiers, which had only slightly lower accuracy. For applications such as wetland mapping, where networks must be trained separately for each site, topography, season, and set of atmospheric conditions, ML classifiers proved to be the more pragmatic choice. Full article
(This article belongs to the Special Issue UAVs for Vegetation Monitoring)

24 pages, 6149 KiB  
Article
Using UAV Collected RGB and Multispectral Images to Evaluate Winter Wheat Performance across a Site Characterized by Century-Old Biochar Patches in Belgium
by Ramin Heidarian Dehkordi, Victor Burgeon, Julien Fouche, Edmundo Placencia Gomez, Jean-Thomas Cornelis, Frederic Nguyen, Antoine Denis and Jeroen Meersmans
Remote Sens. 2020, 12(15), 2504; https://0-doi-org.brum.beds.ac.uk/10.3390/rs12152504 - 04 Aug 2020
Cited by 18 | Viewed by 5150
Abstract
Remote sensing data play a crucial role in monitoring crop dynamics in the context of precision agriculture by characterizing the spatial and temporal variability of crop traits. At present, there is special interest in assessing the long-term impacts of biochar in agro-ecosystems. Despite the growing body of literature on monitoring the potential biochar effects on harvested crop yield and aboveground productivity, studies focusing on detailed crop performance as a consequence of long-term biochar enrichment are still lacking. The primary objective of this research was to evaluate crop performance based on high-resolution unmanned aerial vehicle (UAV) imagery, considering both crop growth and health through RGB and multispectral analysis, respectively. More specifically, this approach allowed monitoring of century-old biochar impacts on winter wheat crop performance. Seven Red-Green-Blue (RGB) and six multispectral flights were executed over 11 century-old biochar patches of a cultivated field. UAV-based RGB imagery exhibited a significant positive impact of century-old biochar on the evolution of winter wheat canopy cover (p-value = 0.00007). The multispectral optimized soil-adjusted vegetation index indicated better crop development over the century-old biochar plots at the beginning of the season (p-values < 0.01), while there was no impact towards the end of the season. Plant height, derived from the RGB imagery, was slightly higher for century-old biochar plots. Crop health maps were computed based on principal component analysis and k-means clustering. To our knowledge, this is the first attempt to quantify century-old biochar effects on crop performance during the entire growing period using remotely sensed data. Ground-based measurements illustrated a significant positive impact of century-old biochar on crop growth stages (p-value = 0.01265), whereas the harvested crop yield was not affected. The multispectral simplified canopy chlorophyll content index and normalized difference red edge index were found to be good linear estimators of harvested crop yield (p-value(Kendall) of 0.001 and 0.0008, respectively). The present research highlights that other factors (e.g., inherent pedological variations) are of higher importance than the presence of century-old biochar in determining crop health and yield variability. Full article
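The crop-health mapping step above combines principal component analysis with k-means clustering. A minimal NumPy sketch of that idea, not the authors' exact pipeline: the band count, number of clusters, and the deterministic initialization along the first principal component are all illustrative assumptions.

```python
import numpy as np

def crop_health_clusters(bands, k=3, n_iter=50):
    """Cluster per-pixel spectra into k relative 'health' classes.

    bands: (n_pixels, n_bands) array of reflectance samples.
    Returns integer cluster labels of shape (n_pixels,).
    """
    # PCA: centre the data and project onto the two leading components.
    X = bands - bands.mean(axis=0)
    eigvals, eigvecs = np.linalg.eigh(np.cov(X, rowvar=False))
    scores = X @ eigvecs[:, np.argsort(eigvals)[::-1][:2]]

    # Plain k-means on the PC scores, seeded at spread-out points
    # along PC1 so the result is deterministic.
    order = np.argsort(scores[:, 0])
    centres = scores[order[np.linspace(0, len(X) - 1, k).astype(int)]].copy()
    for _ in range(n_iter):
        d = np.linalg.norm(scores[:, None, :] - centres[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centres[j] = scores[labels == j].mean(axis=0)
    return labels
```

The cluster labels are relative classes per field and date; turning them into "healthy/stressed" categories still requires ground reference, as in the study.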
(This article belongs to the Special Issue UAVs for Vegetation Monitoring)

20 pages, 3611 KiB  
Article
A Plant-by-Plant Method to Identify and Treat Cotton Root Rot Based on UAV Remote Sensing
by Tianyi Wang, J. Alex Thomasson, Thomas Isakeit, Chenghai Yang and Robert L. Nichols
Remote Sens. 2020, 12(15), 2453; https://0-doi-org.brum.beds.ac.uk/10.3390/rs12152453 - 30 Jul 2020
Cited by 20 | Viewed by 3765
Abstract
Cotton root rot (CRR), caused by the fungus Phymatotrichopsis omnivora, is a destructive cotton disease that mainly affects the crop in Texas. Flutriafol fungicide applied at or soon after planting has been proven effective at protecting cotton plants from being infected by CRR. Previous research has indicated that CRR will reoccur in the same regions of a field as in past years. CRR-infected plants can be detected with aerial remote sensing (RS). As unmanned aerial vehicles (UAVs) have been introduced into agricultural RS, the spatial resolution of farm images has increased significantly, making plant-by-plant (PBP) CRR classification possible. An unsupervised classification algorithm, PBP, based on the Superpixel concept, was developed to delineate CRR-infested areas at roughly the single-plant level. Five-band multispectral data were collected with a UAV to test these methods. The results indicated that the single-plant level classification achieved overall accuracy as high as 95.94%. Compared to regional classifications, PBP classification performed better in overall accuracy, kappa coefficient, errors of commission, and errors of omission. The single-plant fungicide application was also effective in preventing CRR. Full article
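The regional versus plant-by-plant comparison above is reported through overall accuracy, the kappa coefficient, and errors of commission/omission. Both headline metrics follow directly from a confusion matrix; a small generic sketch (the example matrix in the usage test is hypothetical, not the paper's data):

```python
import numpy as np

def accuracy_and_kappa(confusion):
    """Overall accuracy and Cohen's kappa from a square confusion matrix.

    confusion[i, j] = number of samples with true class i predicted as class j.
    """
    c = np.asarray(confusion, dtype=float)
    n = c.sum()
    observed = np.trace(c) / n                                # overall accuracy
    expected = (c.sum(axis=0) * c.sum(axis=1)).sum() / n**2   # chance agreement
    kappa = (observed - expected) / (1.0 - expected)
    return observed, kappa
```

Kappa discounts the agreement expected by chance, which is why it is a stricter summary than overall accuracy when class sizes are unbalanced.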
(This article belongs to the Special Issue UAVs for Vegetation Monitoring)

22 pages, 8043 KiB  
Article
Comparison of Object Detection and Patch-Based Classification Deep Learning Models on Mid- to Late-Season Weed Detection in UAV Imagery
by Arun Narenthiran Veeranampalayam Sivakumar, Jiating Li, Stephen Scott, Eric Psota, Amit J. Jhala, Joe D. Luck and Yeyin Shi
Remote Sens. 2020, 12(13), 2136; https://0-doi-org.brum.beds.ac.uk/10.3390/rs12132136 - 03 Jul 2020
Cited by 99 | Viewed by 7793
Abstract
Mid- to late-season weeds that escape routine early-season weed management threaten agricultural production by creating a large number of seeds for several future growing seasons. Rapid and accurate detection of weed patches in the field is the first step of site-specific weed management. In this study, object detection-based convolutional neural network models were trained and evaluated on low-altitude unmanned aerial vehicle (UAV) imagery for mid- to late-season weed detection in soybean fields. The performance of two object detection models, Faster RCNN and the Single Shot Detector (SSD), was evaluated and compared in terms of weed detection performance using mean Intersection over Union (IoU) and inference speed. The Faster RCNN model with 200 box proposals achieved weed detection performance similar to that of the SSD model in terms of precision, recall, F1 score, and IoU, as well as a similar inference time. The precision, recall, F1 score and IoU were 0.65, 0.68, 0.66 and 0.85 for Faster RCNN with 200 proposals, and 0.66, 0.68, 0.67 and 0.84 for SSD, respectively. However, the optimal confidence threshold of the SSD model was found to be much lower than that of the Faster RCNN model, which indicated that SSD might have lower generalization performance than Faster RCNN for mid- to late-season weed detection in soybean fields using UAV imagery. The performance of the object detection models was also compared with a patch-based CNN model. The Faster RCNN model yielded better weed detection performance than the patch-based CNN with and without overlap. The inference time of Faster RCNN was similar to that of the patch-based CNN without overlap, but significantly less than that of the patch-based CNN with overlap. Hence, Faster RCNN was found to be the best model in terms of weed detection performance and inference time among the models compared in this study. This work is important for understanding the potential of, and identifying suitable algorithms for, on-farm, near real-time weed detection and management. Full article
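The detection quality above is summarized by Intersection over Union. For axis-aligned bounding boxes, IoU is computed as follows (a generic sketch, independent of the Faster RCNN/SSD implementations used in the paper):

```python
def box_iou(a, b):
    """Intersection over Union of two axis-aligned boxes (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)
```

A predicted box typically counts as a true positive when its IoU with a ground-truth box exceeds a fixed threshold, which is how precision and recall figures like those above are derived.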
(This article belongs to the Special Issue UAVs for Vegetation Monitoring)

16 pages, 5156 KiB  
Article
Assessing the Operation Parameters of a Low-altitude UAV for the Collection of NDVI Values Over a Paddy Rice Field
by Rui Jiang, Pei Wang, Yan Xu, Zhiyan Zhou, Xiwen Luo, Yubin Lan, Genping Zhao, Arturo Sanchez-Azofeifa and Kati Laakso
Remote Sens. 2020, 12(11), 1850; https://0-doi-org.brum.beds.ac.uk/10.3390/rs12111850 - 08 Jun 2020
Cited by 15 | Viewed by 3324
Abstract
Unmanned aerial vehicle (UAV) remote sensing platforms allow normalized difference vegetation index (NDVI) values to be mapped at relatively high resolution, enabling an unprecedented ability to evaluate the influence of operation parameters on the quality of the acquired data. In order to better understand the effects of these parameters, we made a comprehensive evaluation of the effects of the solar zenith angle (SZA), the time of day (TOD), the flight altitude (FA) and the growth level of paddy rice at the pixel scale on UAV-acquired NDVI values. Our results show that: (1) there was an inverse relationship between the FA (≤100 m) and the mean NDVI values; (2) TOD and SZA had a greater impact on UAV-NDVIs than the FA and the growth level; (3) better growth levels of rice, measured using the NDVI, could reduce the effects of the FA, TOD and SZA. We expect that our results could be used to better plan flight campaigns that aim to collect NDVI values over paddy rice fields. Full article
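For reference, the NDVI values mapped in this study are computed per pixel from the red and near-infrared bands. This is the standard formulation; the small epsilon guarding division by zero is an implementation choice, not part of the index definition:

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """Normalized difference vegetation index, computed pixel-wise.

    nir, red: scalars or arrays of near-infrared and red reflectance.
    Values near +1 indicate dense green vegetation; bare soil sits near 0.
    """
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + eps)
```

Because NDVI is a band ratio, it is partly insensitive to illumination level, yet, as the study shows, TOD and SZA still influence the retrieved values through directional reflectance effects.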
(This article belongs to the Special Issue UAVs for Vegetation Monitoring)

19 pages, 4859 KiB  
Article
Closing the Phenotyping Gap: High Resolution UAV Time Series for Soybean Growth Analysis Provides Objective Data from Field Trials
by Irene Borra-Serrano, Tom De Swaef, Paul Quataert, Jonas Aper, Aamir Saleem, Wouter Saeys, Ben Somers, Isabel Roldán-Ruiz and Peter Lootens
Remote Sens. 2020, 12(10), 1644; https://0-doi-org.brum.beds.ac.uk/10.3390/rs12101644 - 20 May 2020
Cited by 35 | Viewed by 5061
Abstract
Close remote sensing approaches can be used for high-throughput on-field phenotyping in the context of plant breeding and biological research. Data on canopy cover (CC) and canopy height (CH) and their temporal changes throughout the growing season can yield information about crop growth and performance. In the present study, sigmoid models were fitted to multi-temporal CC and CH data obtained using RGB imagery captured with a drone for a broad set of soybean genotypes. The Gompertz and Beta functions were used to fit the CC and CH data, respectively. Overall, 90.4% of the fits for CC and 99.4% of the fits for CH reached an adjusted R2 > 0.70, demonstrating the good performance of the chosen models. Using these growth curves, parameters including maximum absolute growth rate, early vigor, maximum height, and senescence were calculated for a collection of soybean genotypes. This information was also used to estimate seed yield and maturity (R8 stage) (adjusted R2 = 0.51 and 0.82, respectively). Combinations of parameter values were tested to identify genotypes with interesting traits. An integrative approach of fitting a curve to a multi-temporal dataset resulted in biologically interpretable parameters that were informative for relevant traits. Full article
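The canopy-cover curve fitting described above uses the Gompertz function. A minimal sketch with `scipy.optimize.curve_fit`; the exact parameterization and the starting values are illustrative assumptions, and the authors' formulation may differ:

```python
import numpy as np
from scipy.optimize import curve_fit

def gompertz(t, A, k, t0):
    """Gompertz growth curve: asymptote A, rate k, inflection time t0."""
    return A * np.exp(-np.exp(-k * (t - t0)))

def fit_canopy_cover(days, cc):
    """Fit a Gompertz curve to canopy-cover observations; returns (A, k, t0)."""
    p0 = (max(cc), 0.1, float(np.median(days)))   # rough starting values
    params, _ = curve_fit(gompertz, days, cc, p0=p0, maxfev=10000)
    return params
```

Traits like the maximum absolute growth rate then fall out of the fitted parameters analytically (for this parameterization it is A*k/e, reached at t0), which is what makes the curve-fitting approach attractive for phenotyping.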
(This article belongs to the Special Issue UAVs for Vegetation Monitoring)

17 pages, 3803 KiB  
Article
Segmenting Purple Rapeseed Leaves in the Field from UAV RGB Imagery Using Deep Learning as an Auxiliary Means for Nitrogen Stress Detection
by Jian Zhang, Tianjin Xie, Chenghai Yang, Huaibo Song, Zhao Jiang, Guangsheng Zhou, Dongyan Zhang, Hui Feng and Jing Xie
Remote Sens. 2020, 12(9), 1403; https://0-doi-org.brum.beds.ac.uk/10.3390/rs12091403 - 29 Apr 2020
Cited by 32 | Viewed by 4938
Abstract
Crop leaf purpling is a common phenotypic change when plants are subject to certain biotic and abiotic stresses during their growth. The extraction of purple leaves can monitor crop stresses as an apparent trait and meanwhile contributes to crop phenotype analysis, monitoring, and yield estimation. Due to the complexity of the field environment as well as differences in size, shape, texture, and color gradation among the leaves, purple leaf segmentation is difficult. In this study, we used a U-Net model for segmenting purple rapeseed leaves during the seedling stage at the pixel level, based on unmanned aerial vehicle (UAV) RGB imagery. Given the limited spatial resolution of the rapeseed images acquired by UAV and the small object size, the input patch size was carefully selected. Experiments showed that the U-Net model with a patch size of 256 × 256 pixels obtained better and more stable results, with an F-measure of 90.29% and an Intersection over Union (IoU) of 82.41%. To further explore the influence of image spatial resolution, we evaluated the performance of the U-Net model with different image resolutions and patch sizes. The U-Net model performed better than four other commonly used image segmentation approaches: support vector machine, random forest, HSeg, and SegNet. Moreover, regression analysis was performed between the purple rapeseed leaf ratios and the measured N content. The negative exponential model had a coefficient of determination (R²) of 0.858, thereby explaining much of the variation in rapeseed leaf purpling in this study. This purple leaf phenotype could be an auxiliary means for monitoring crop growth status so that crops can be managed in a timely and effective manner when nitrogen stress occurs. Results demonstrate that the U-Net model is a robust method for purple rapeseed leaf segmentation and that the accurate segmentation of purple leaves provides a new method for crop nitrogen stress monitoring. Full article
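Feeding a UAV orthomosaic to a U-Net requires cutting it into fixed-size patches such as the 256 × 256 pixels chosen above. A minimal sketch of that preprocessing step; non-overlapping tiling with incomplete border tiles dropped is an illustrative choice, not necessarily the authors' exact scheme:

```python
import numpy as np

def tile_patches(image, patch=256):
    """Split an H×W×C image into non-overlapping patch×patch tiles.

    Edge regions that do not fill a whole tile are dropped here; padding
    the borders instead is an equally common alternative.
    """
    h, w = image.shape[:2]
    tiles = []
    for y in range(0, h - patch + 1, patch):
        for x in range(0, w - patch + 1, patch):
            tiles.append(image[y:y + patch, x:x + patch])
    return np.stack(tiles)
```

At inference time the predicted patch masks are placed back at their grid positions to reassemble a full-field segmentation map.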
(This article belongs to the Special Issue UAVs for Vegetation Monitoring)

34 pages, 18681 KiB  
Article
Influence of Model Grid Size on the Estimation of Surface Fluxes Using the Two Source Energy Balance Model and sUAS Imagery in Vineyards
by Ayman Nassar, Alfonso Torres-Rua, William Kustas, Hector Nieto, Mac McKee, Lawrence Hipps, David Stevens, Joseph Alfieri, John Prueger, Maria Mar Alsina, Lynn McKee, Calvin Coopmans, Luis Sanchez and Nick Dokoozlian
Remote Sens. 2020, 12(3), 342; https://0-doi-org.brum.beds.ac.uk/10.3390/rs12030342 - 21 Jan 2020
Cited by 27 | Viewed by 5430
Abstract
Evapotranspiration (ET) is a key variable for hydrology and irrigation water management, with significant importance in drought-stricken regions of the western US. This is particularly true for California, which grows much of the high-value perennial crops in the US. The advent of small Unmanned Aerial Systems (sUAS) with sensor technology similar to satellite platforms allows for the estimation of high-resolution ET at plant-spacing scale for individual fields. However, while multiple efforts have been made to estimate ET from sUAS products, the sensitivity of ET models to different model grid sizes/resolutions in complex canopies, such as vineyards, is still unknown. This information is necessary because the variability of row spacing, canopy structure, and distance between fields adds complexity to processing individual fields; processing the entire image at a fixed resolution, potentially larger than the plant-row separation, is therefore more efficient. From a computational perspective, there would also be an advantage to running models at much coarser resolutions than the very fine native pixel size of sUAS imagery for operational applications. In this study, the Two-Source Energy Balance with a dual temperature (TSEB2T) model, which uses remotely sensed soil/substrate and canopy temperatures from sUAS imagery, was used to estimate ET and identify the impact of spatial domain scale under different vine phenological conditions. The analysis relies upon high-resolution imagery collected during multiple years and times by the Utah State University AggieAir sUAS program over a commercial vineyard located near Lodi, California. This project is part of the USDA-Agricultural Research Service Grape Remote Sensing Atmospheric Profile and Evapotranspiration eXperiment (GRAPEX). Original spectral and thermal imagery data from the sUAS were at 10 cm and 60 cm per pixel, respectively, and multiple spatial domain scales (3.6, 7.2, 14.4, and 30 m) were evaluated and compared against eddy covariance (EC) measurements. Results indicated that the TSEB2T model is only slightly affected in the estimation of net radiation (Rn) and soil heat flux (G) at different spatial resolutions, while the sensible and latent heat fluxes (H and LE, respectively) are significantly affected by coarse grid sizes. The results indicated overestimation of H and underestimation of LE, particularly at the Landsat scale (30 m). This is attributed to the non-linear relationship between land surface temperature (LST) and the normalized difference vegetation index (NDVI) at coarse model resolutions. Another predominant reason for the LE reduction in TSEB2T was the decrease in the aerodynamic resistance (Ra), which is a function of the friction velocity (u*) that varies with mean canopy height and roughness length. While a small increase in grid size can be implemented, this increase should be limited to less than twice the smallest row spacing present in the sUAS imagery. The results also indicated that the mean LE at field scale is reduced by 10% to 20% at coarser resolutions, while the within-field variability in LE values decreased significantly at the larger grid sizes, ranging between approximately 15% and 45%. This implies that, while the field-scale values of LE are fairly reliable at larger grid sizes, the loss of within-field variability limits their use for precision agriculture applications. Full article
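The grid-size experiment above amounts to aggregating fine sUAS pixels into coarser model cells. A block-averaging sketch of that resampling step (illustrative only; the actual TSEB2T preprocessing may differ):

```python
import numpy as np

def block_mean(raster, factor):
    """Aggregate a 2-D raster to a coarser grid by averaging factor×factor blocks.

    Trailing rows/columns that do not fill a whole block are trimmed.
    """
    h, w = raster.shape
    h2, w2 = (h // factor) * factor, (w // factor) * factor
    r = raster[:h2, :w2]
    return r.reshape(h2 // factor, factor, w2 // factor, factor).mean(axis=(1, 3))
```

Because TSEB-type models apply non-linear functions of LST and NDVI, running the model on block means is not equivalent to averaging fine-scale model output; that inequality is consistent with the H overestimation and LE underestimation reported above at coarse grids.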
(This article belongs to the Special Issue UAVs for Vegetation Monitoring)

18 pages, 8878 KiB  
Article
A Comparative Study of RGB and Multispectral Sensor-Based Cotton Canopy Cover Modelling Using Multi-Temporal UAS Data
by Akash Ashapure, Jinha Jung, Anjin Chang, Sungchan Oh, Murilo Maeda and Juan Landivar
Remote Sens. 2019, 11(23), 2757; https://0-doi-org.brum.beds.ac.uk/10.3390/rs11232757 - 23 Nov 2019
Cited by 50 | Viewed by 5398
Abstract
This study presents a comparison of multispectral and RGB (red, green, and blue) sensor-based cotton canopy cover modelling using multi-temporal unmanned aircraft system (UAS) imagery. Additionally, a canopy cover model using an RGB sensor is proposed that combines an RGB-based vegetation index with morphological closing. The field experiment was established in 2017 and 2018, where the whole study area was divided into grids of approximately 1 × 1 m. Grid-wise percentage canopy cover was computed using both RGB and multispectral sensors over multiple flights during the growing season of the cotton crop. Initially, the normalized difference vegetation index (NDVI)-based canopy cover was estimated, and this was used as a reference for comparison with the RGB-based canopy cover estimations. To test the maximum achievable performance of RGB-based canopy cover estimation, a pixel-wise classification method was implemented. Later, four RGB-based canopy cover estimation methods were implemented using RGB images, namely Canopeo, the excessive greenness index, the modified red green vegetation index and the red green blue vegetation index. The performance of RGB-based canopy cover estimation was evaluated against the NDVI-based canopy cover estimation. The multispectral sensor-based canopy cover model was more stable and accurate, whereas the RGB-based canopy cover model was very unstable and failed to identify canopy when cotton leaves changed color after canopy maturation. The application of a morphological closing operation after thresholding significantly improved the RGB-based canopy cover modeling. The red green blue vegetation index turned out to be the most efficient vegetation index for extracting canopy cover, with a very low average root mean square error (2.94% for the 2017 dataset and 2.82% for the 2018 dataset) with respect to the multispectral sensor-based canopy cover estimation. The proposed canopy cover model provides an affordable alternative to multispectral sensors, which are more sensitive and more expensive. Full article
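The proposed RGB pipeline thresholds a greenness index and then applies morphological closing to fill small gaps inside the canopy mask. A minimal sketch using the excess-green index as a stand-in greenness measure; the 0.05 threshold and the 3 × 3 structuring element are illustrative assumptions, not the paper's calibrated values:

```python
import numpy as np
from scipy.ndimage import binary_closing

def canopy_cover_rgb(rgb, thresh=0.05):
    """Estimate fractional canopy cover from an RGB image.

    Computes the excess-green index ExG = 2g - r - b on chromatic
    coordinates, thresholds it, then applies a morphological closing
    to fill small within-canopy holes. Returns the canopy fraction.
    """
    rgb = rgb.astype(float)
    total = rgb.sum(axis=2) + 1e-9
    r, g, b = (rgb[..., i] / total for i in range(3))
    exg = 2 * g - r - b
    mask = binary_closing(exg > thresh, structure=np.ones((3, 3)))
    return mask.mean()   # fraction of pixels classified as canopy
```

Closing (a dilation followed by an erosion) is what recovers canopy pixels that the raw threshold misses, which matches the improvement the abstract attributes to this step.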
(This article belongs to the Special Issue UAVs for Vegetation Monitoring)

16 pages, 15109 KiB  
Article
Using Vegetation Indices and a UAV Imaging Platform to Quantify the Density of Vegetation Ground Cover in Olive Groves (Olea Europaea L.) in Southern Spain
by Francisco J. Lima-Cueto, Rafael Blanco-Sepúlveda, María L. Gómez-Moreno and Federico B. Galacho-Jiménez
Remote Sens. 2019, 11(21), 2564; https://0-doi-org.brum.beds.ac.uk/10.3390/rs11212564 - 01 Nov 2019
Cited by 32 | Viewed by 5564
Abstract
In olive groves, vegetation ground cover (VGC) plays an important ecological role. The EU Common Agricultural Policy, through cross-compliance, acknowledges the importance of this factor, but, to determine the real impact of VGC, it must first be quantified. Accordingly, in the present study, eleven vegetation indices (VIs) were applied to quantify the density of VGC in olive groves (Olea europaea L.), according to high spatial resolution (10–12 cm) multispectral images obtained by an unmanned aerial vehicle (UAV). The fieldwork was conducted in early spring, in a Mediterranean mountain olive grove in southern Spain presenting various VGC densities. A five-step method was applied: (1) generate image mosaics using UAV technology; (2) apply the VIs; (3) quantify VGC density by means of sampling plots (ground-truth); (4) calculate the mean reflectance of the spectral bands and of the VIs in each sampling plot; and (5) quantify VGC density according to the VIs. The most sensitive index was IRVI, which accounted for 82% (p < 0.001) of the variability of VGC density. The capability of the VIs to differentiate VGC densities increased in line with the cover interval range. RVI most accurately distinguished VGC densities > 80% in a cover interval range of 10% (p < 0.001), while IRVI was most accurate for VGC densities < 30% in a cover interval range of 15% (p < 0.01). IRVI, NRVI, NDVI, GNDVI and SAVI differentiated the complete series of VGC densities when the cover interval range was 30% (p < 0.001 and p < 0.05). Full article
(This article belongs to the Special Issue UAVs for Vegetation Monitoring)

18 pages, 19991 KiB  
Article
Estimating and Examining the Sensitivity of Different Vegetation Indices to Fractions of Vegetation Cover at Different Scaling Grids for Early Stage Acacia Plantation Forests Using a Fixed-Wing UAS
by Kotaro Iizuka, Tsuyoshi Kato, Sisva Silsigia, Alifia Yuni Soufiningrum and Osamu Kozan
Remote Sens. 2019, 11(15), 1816; https://0-doi-org.brum.beds.ac.uk/10.3390/rs11151816 - 03 Aug 2019
Cited by 18 | Viewed by 4831
Abstract
Understanding information on land conditions, and especially green vegetation cover, is important for monitoring ecosystem dynamics. The fraction of vegetation cover (FVC) is a key variable that can be used to observe vegetation cover trends. Conventionally, satellite data are utilized to compute these variables, although in regions such as the tropics frequent cloud coverage can limit the amount of available observation information. Unmanned aerial systems (UASs) have become increasingly prominent in recent research and can remotely sense using the same methods as satellites but at a lower altitude. UASs are not limited by clouds and offer much higher resolution. This study utilizes a UAS to determine the emerging trends for FVC estimates at an industrial plantation site in Indonesia, which utilizes fast-growing Acacia trees that can rapidly change the land conditions. First, the UAS was utilized to collect high-resolution RGB imagery and multispectral images for the study area. The data were used to develop general land use/land cover (LULC) information for the site. Multispectral data were converted to various vegetation indices (VIs), and within the determined resolution grids (5, 10, 30 and 60 m), the fraction of each LULC type was analyzed for its correlation with the different VIs. Finally, a simple empirical model was developed to estimate the FVC from the UAS data. The results show correlations between the FVC (acacias) and the different VIs ranging from R2 = 0.66–0.74, 0.76–0.80, 0.84–0.89 and 0.93–0.94 for the 5, 10, 30 and 60 m grid resolutions, respectively. This study indicates that UAS-based FVC estimation can be used for observing fast-growing acacia trees at a fine spatial resolution, which may assist current restoration programs in Indonesia. Full article
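The "simple empirical model" relating a VI to FVC can be sketched as an ordinary least-squares line with an R2 diagnostic. This is illustrative only; the authors' chosen VI and model form may differ:

```python
import numpy as np

def fvc_model(vi, fvc):
    """Fit the empirical model FVC = a * VI + b and report R^2.

    vi, fvc: 1-D arrays of grid-cell mean vegetation index values and
    the observed fraction of vegetation cover in each cell.
    """
    a, b = np.polyfit(vi, fvc, 1)
    pred = a * vi + b
    ss_res = np.sum((fvc - pred) ** 2)
    ss_tot = np.sum((fvc - fvc.mean()) ** 2)
    return a, b, 1.0 - ss_res / ss_tot
```

Fitting the model per grid resolution (5, 10, 30, 60 m) is what produces the R2 ranges quoted above; coarser cells average out within-cell heterogeneity, which tends to raise R2.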
(This article belongs to the Special Issue UAVs for Vegetation Monitoring)

Review

13 pages, 411 KiB  
Review
UAVs for Vegetation Monitoring: Overview and Recent Scientific Contributions
by Ana I. de Castro, Yeyin Shi, Joe Mari Maja and Jose M. Peña
Remote Sens. 2021, 13(11), 2139; https://0-doi-org.brum.beds.ac.uk/10.3390/rs13112139 - 29 May 2021
Cited by 66 | Viewed by 9814
Abstract
This paper reviewed a set of twenty-one original and innovative papers included in a special issue on UAVs for vegetation monitoring, which proposed new methods and techniques applied to diverse agricultural and forestry scenarios. Three general categories were considered: (1) sensors and vegetation indices used, (2) technological goals pursued, and (3) agroforestry applications. Some investigations focused on issues related to UAV flight operations, spatial resolution requirements, and computation and data analytics, while others studied the ability of UAVs for characterizing relevant vegetation features (mainly canopy cover and crop height) or for detecting different plant/crop stressors, such as nutrient content/deficiencies, water needs, weeds, and diseases. The general goal was proposing UAV-based technological solutions for a better use of agricultural and forestry resources and more efficient production with relevant economic and environmental benefits. Full article
(This article belongs to the Special Issue UAVs for Vegetation Monitoring)

Other

14 pages, 2140 KiB  
Letter
Recognition of Banana Fusarium Wilt Based on UAV Remote Sensing
by Huichun Ye, Wenjiang Huang, Shanyu Huang, Bei Cui, Yingying Dong, Anting Guo, Yu Ren and Yu Jin
Remote Sens. 2020, 12(6), 938; https://0-doi-org.brum.beds.ac.uk/10.3390/rs12060938 - 13 Mar 2020
Cited by 72 | Viewed by 8008
Abstract
Fusarium wilt (Panama disease) of banana currently threatens banana production areas worldwide. Timely monitoring of Fusarium wilt is important for disease treatment and adjustment of banana planting methods. The objective of this study was to establish a method for identifying banana regions infested or not infested with Fusarium wilt using unmanned aerial vehicle (UAV)-based multispectral imagery. Two experiments were conducted. In experiment 1, 120 sample plots were surveyed, of which 75% were used as the modeling dataset for model fitting and the remainder as validation dataset 1 (VD1). In experiment 2, 35 sample plots were surveyed and used as validation dataset 2 (VD2). A UAV equipped with a five-band multispectral camera was used to capture the multispectral imagery. Eight vegetation indices (VIs) related to pigment absorption and plant growth changes were chosen for determining the biophysical and biochemical characteristics of the plants. The binary logistic regression (BLR) method was used to assess the spatial relationships between the VIs and the plants infested or not infested with Fusarium wilt. The results showed that banana Fusarium wilt can be easily identified using VIs including the green chlorophyll index (CIgreen), red-edge chlorophyll index (CIRE), normalized difference vegetation index (NDVI), and normalized difference red-edge index (NDRE). The fitting overall accuracies of the models were greater than 80%. Among the investigated VIs, the CIRE exhibited the best performance for both VD1 (OA = 91.7%, Kappa = 0.83) and VD2 (OA = 80.0%, Kappa = 0.59). For the same type of VI, indices including a red-edge band performed better than those without one. A simulation of imagery with different spatial resolutions (i.e., 0.5-m, 1-m, 2-m, 5-m, and 10-m resolutions) showed that good identification accuracy of Fusarium wilt was obtained when the resolution was finer than 2 m; as the resolution coarsened, identification accuracy decreased. The findings indicate that UAV-based remote sensing with a red-edge band is suitable for identifying banana Fusarium wilt. The results of this study provide guidance for detecting the disease and adjusting crop planting. Full article
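The BLR step above models infestation probability as a logistic function of a vegetation index. A self-contained gradient-descent sketch of that idea, a stand-in for standard BLR fitting; the single-predictor setup, learning rate, and iteration count are arbitrary illustrative choices:

```python
import numpy as np

def fit_logistic(vi, infested, lr=0.5, n_iter=5000):
    """Binary logistic regression of infestation status on one vegetation index.

    vi: 1-D array of index values (e.g., CIRE per plot);
    infested: 0/1 labels. Returns (w, b) such that
    P(infested) = 1 / (1 + exp(-(w * vi + b))).
    """
    vi = np.asarray(vi, dtype=float)
    y = np.asarray(infested, dtype=float)
    w = b = 0.0
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-(w * vi + b)))
        w -= lr * np.mean((p - y) * vi)   # gradient of the mean log-loss
        b -= lr * np.mean(p - y)
    return w, b
```

Because infested plants show reduced chlorophyll, the fitted weight on a chlorophyll-sensitive VI would be expected to come out negative, as in the toy data below.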
(This article belongs to the Special Issue UAVs for Vegetation Monitoring)

13 pages, 9379 KiB  
Letter
Watson on the Farm: Using Cloud-Based Artificial Intelligence to Identify Early Indicators of Water Stress
by Daniel Freeman, Shaurya Gupta, D. Hudson Smith, Joe Mari Maja, James Robbins, James S. Owen, Jr., Jose M. Peña and Ana I. de Castro
Remote Sens. 2019, 11(22), 2645; https://0-doi-org.brum.beds.ac.uk/10.3390/rs11222645 - 13 Nov 2019
Cited by 22 | Viewed by 4659
Abstract
As demand for freshwater increases while supply remains stagnant, the critical need for sustainable water use in agriculture has led the EPA Strategic Plan to call for new technologies that can optimize water allocation in real time. This work assesses the use of cloud-based artificial intelligence to detect early indicators of water stress across six container-grown ornamental shrub species. Near-infrared images were previously collected with modified Canon and MAPIR Survey II cameras deployed via a small unmanned aircraft system (sUAS) at an altitude of 30 meters. Cropped images of plants under no-, low-, and high-water-stress conditions were split into four-fold cross-validation sets and used to train models through IBM Watson's Visual Recognition service. Despite constraints such as small sample size (36 plants, 150 images) and low image resolution (150 × 150 pixels per plant), Watson-generated models were able to detect indicators of stress after 48 hours of water deprivation with a significant to marginally significant degree of separation in four out of five species tested (p < 0.10). Two models were also able to detect indicators of water stress after only 24 hours, with models trained on images of as few as eight water-stressed Buddleia plants achieving an average area under the curve (AUC) of 0.9884 across four folds. Ease of pre-processing, the minimal amount of training data required, and outsourced computation make cloud-based artificial intelligence services such as IBM Watson Visual Recognition an attractive tool for agricultural analytics. Cloud-based artificial intelligence can be combined with technologies such as sUAS and spectral imaging to help crop producers identify deficient irrigation strategies and intervene before crop value is diminished. When brought to scale, frameworks such as these can drive responsive irrigation systems that monitor crop status in real time and maximize sustainable water use. Full article
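The AUC reported above can be computed without tracing an ROC curve, via the rank-sum (Mann-Whitney) identity: it is the probability that a randomly chosen stressed plant scores higher than a randomly chosen unstressed one. A generic sketch, unrelated to the Watson service itself:

```python
import numpy as np

def auc(scores, labels):
    """Area under the ROC curve via the rank-sum identity.

    scores: model confidence for the positive ('water-stressed') class;
    labels: 1 for stressed, 0 for unstressed.
    """
    scores = np.asarray(scores, dtype=float)
    labels = np.asarray(labels)
    pos = scores[labels == 1]
    neg = scores[labels == 0]
    # Fraction of (positive, negative) pairs ranked correctly; ties count half.
    wins = (pos[:, None] > neg[None, :]).sum() + 0.5 * (pos[:, None] == neg[None, :]).sum()
    return wins / (len(pos) * len(neg))
```

Averaging this quantity over the four cross-validation folds gives a fold-averaged AUC of the kind quoted above.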
(This article belongs to the Special Issue UAVs for Vegetation Monitoring)