Article

Mapping and Estimating Weeds in Cotton Using Unmanned Aerial Systems-Borne Imagery

1 Department of Soil and Crop Sciences, Texas A&M University, College Station, TX 77843, USA
2 Department of Mechanical Engineering, Texas A&M University, College Station, TX 77843, USA
3 Department of Aerospace Engineering, Texas A&M University, College Station, TX 77843, USA
* Author to whom correspondence should be addressed.
Submission received: 10 May 2020 / Revised: 3 June 2020 / Accepted: 7 June 2020 / Published: 16 June 2020
(This article belongs to the Special Issue Feature Papers in Cotton Automation, Machine Vision and Robotics)

Abstract: In recent years, Unmanned Aerial Systems (UAS) have emerged as an innovative technology for providing spatio-temporal information about weed species in crop fields, a critical input for any site-specific weed management program. A multi-rotor UAS (Phantom 4) equipped with an RGB sensor was used to collect imagery in three bands (Red, Green, and Blue; 0.8 cm/pixel resolution) with the objectives of (a) mapping weeds in cotton and (b) determining the relationship between image-based weed coverage and ground-based weed densities. For weed mapping, three weed density levels (high, medium, and low) were established for a mix of different weed species, with three replications. To determine weed densities through ground truthing, five quadrats (1 m × 1 m) were laid out in each plot. The aerial imagery was preprocessed and subjected to Hough transformation to delineate cotton rows. Following the separation of inter-row vegetation from crop rows, a multi-level classification coupled with machine learning algorithms was used to distinguish intra-row weeds from cotton. Overall accuracies of 89.16%, 85.83%, and 83.33% and kappa values of 0.84, 0.79, and 0.75 were achieved for detecting weed occurrence in the low, medium, and high density plots, respectively. Further, ground-truthing based overall weed density values were well correlated (r2 = 0.80) with image-based weed coverage assessments. Among the specific weed species evaluated, Palmer amaranth (Amaranthus palmeri S. Watson) showed the highest correlation (r2 = 0.91), followed by red sprangletop (Leptochloa mucronata Michx) (r2 = 0.88). The results highlight the utility of UAS-borne RGB imagery for weed mapping and density estimation in cotton for precision weed management.

1. Introduction

Weeds are major pests of agricultural crops and a serious challenge to sustainable crop production [1,2]. A site-specific approach, which takes into account the spatio-temporal variability in weed species establishment and growth, can facilitate effective and economical weed management [3,4]. However, such an approach requires precise determination of the different weed species infesting crop fields and their densities. Currently, weed infestation assessments are typically carried out through manual scouting, which is often inefficient and inaccurate, particularly in large production fields. Remote sensing has long been investigated as an alternative approach for weed infestation assessments across large areas [5,6,7]. Images acquired with remote sensing platforms can be analyzed to produce pixel-level or individual-plant-level information on the location and distribution of weeds throughout the field [7]. Such information can guide farmers in assessing weed-infested areas and developing management grids for site-specific treatment. For example, Castaldi et al. [8] analyzed digital imagery to produce weed coverage maps in maize (Zea mays L.), which were further transformed into 2 m × 2 m grids. These grids were classified into weed-free and weed-infested zones based on a weed threshold and subsequently used for site-specific weed management. Moreover, knowledge of weed distributions and densities across the field is beneficial for tailoring herbicide applications to the spatial dynamics of weeds in the field [9].
Mapping weeds at early phenological growth stages requires high spatial resolution imagery [7,10], although Borra-Serrano et al. [10] found that a spatial resolution as coarse as 5 cm may provide satisfactory results. Traditional remote sensing platforms such as satellites generally do not meet the spatial resolution requirements for weed detection and also have limited capability to provide real-time data. Nonetheless, they have been used to map patches of weed species at the field scale [11,12]. Recent advancements in Unmanned Aerial Systems (UAS) technologies have opened up new opportunities for acquiring high spatial resolution imagery at desired temporal resolutions [13,14], and UAS have been successfully used in various agricultural research areas, including soil and land mapping [15], drought stress monitoring [16,17], crop disease detection [18,19], and in-season yield estimation [20]. The opportunities have expanded further with the ability to extract 3D point clouds from UAS-based high resolution imagery, which have been used extensively in agricultural and forestry studies [21,22,23,24]. 3D point clouds can be analyzed to generate canopy height models and other 3D canopy metrics, which can supplement spectral information. However, extracting and processing this 3D information can be computationally complex and expensive. Over the years, with the increased ability to acquire sub-cm resolution imagery using UAS, several studies have explored the use of high-resolution imagery for weed mapping and classification [7,12,13,25,26,27,28,29]. However, there is still a lack of classification models that serve the unique needs of farmers; thus, there is a strong need to develop case-specific classification models by further experimenting with different image analysis approaches.
With the rapid improvement in sensors, it has become possible to generate multi-band information, including thermal signatures, using UAS technology. These improvements have been adopted successfully in various agricultural applications, including weed detection and mapping. Because non-visible bands can be highly effective in discriminating plant species, several studies have used multispectral imagery [7,30] and hyperspectral imagery [31,32,33] for identifying and classifying weeds in crop fields and forest areas. The utility of thermal bands in plant stress detection [34,35] has paved the way for using thermal imagery in weed detection and mapping. However, these improved sensors are costly, and computation and processing of the data acquired from them can be resource intensive. In many circumstances, these sensors may not prove to be a better option than the relatively cheap and computationally simple visible bands, i.e., the red, green, and blue (RGB) bands [7]. Visible bands have long been utilized for various agricultural applications and have great potential for weed species detection and differentiation [10,36,37]. For example, Lottes et al. [38] recently classified weed species and sugarbeet using UAS-based RGB imagery, with accuracies of 85% and 90%, respectively. Gao et al. [36] detected weed species in maize using UAS-derived RGB imagery and developed highly accurate predictive models for weed density estimation. The accuracy of the weed detection process largely depends upon the model and the training features used for the user-defined classes, in addition to the spectral and spatial resolution of the imagery. One of the most popular approaches to image analysis is the pixel-based approach. The traditional pixel-based approach involves computational analysis at the pixel level and relies heavily on spectral features, disregarding the potential of textural and spatial features to improve model accuracy [39]. However, the recent evolution of convolutional neural networks (CNNs), which take into account the spectral, textural, and spatial features of images, allows for improved classification accuracies.
CNN-based studies for distinguishing between weeds and crops have been increasing in recent times. Bah et al. [40] used CNNs to detect both intra- and inter-row weeds in spinach (Spinacia oleracea L.), beans (Phaseolus vulgaris L.), and beet (Beta vulgaris L.) in UAS-derived RGB images and achieved overall accuracies of 81%, 69%, and 93%, respectively. Using multispectral images, Sa et al. [30] developed a CNN-based model to segment crops and weeds from the soil background and achieved an overall accuracy of 82%. Several other studies have validated the effectiveness of CNN-based models; however, these models are data intensive and their performance suffers when training data are inadequate. In some cases, especially when dealing with binary problems such as the presence or absence of a weed, CNNs may not be necessary. Instead, adequate image analysis can be carried out using much simpler methods such as object-based image analysis (OBIA) [39].
OBIA allows for the generation of a large number of image objects, which can be further classified into user-defined classes [41,42]. This approach was shown to be effective in mapping weeds in maize [26] and sunflower (Helianthus annuus L.) [28] when information such as crop row boundaries was fused with other sets of features generated using OBIA. Such combined information helps segment inter-row weeds easily and minimizes misclassification during the classification process. However, the effectiveness of the fusion approach could differ with variable weed densities, which may lead to fuzzy boundaries between crop rows and proximal weeds. Furthermore, false crop boundary delineation can result in classification errors, as any vegetation pixel outside of the crop row is considered a weed, which may not be the case when crop leaves extend outside of the row.
It is important to investigate the effectiveness of the above-mentioned approach by testing it on areas with varying weed densities. Moreover, it is equally important to develop a post-classification model to refine the classification and minimize errors due to false crop row detection. In this study, an improved methodology that addresses these issues has been tested and demonstrated in cotton (Gossypium hirsutum L.), which is an important crop in Texas and parts of the Southern United States (US). The specific objectives of this study are to (1) test the effectiveness of the improved fusion method (OBIA and crop row detection) to map various densities of weed infestation in a cotton field using high resolution UAS-based RGB imagery, and (2) determine the relationship between weed pixel coverage and ground-based weed densities.

2. Materials and Methods

2.1. Study Site and Establishment

The study was conducted at the Texas A&M AgriLife Research farm near College Station, TX, US (30°32′15.75″ N, 96°25′19.50″ W; elevation: 68 m) (Figure 1). The cotton crop was drill seeded in 1-m wide rows on 1 June 2017. Palmer amaranth (Amaranthus palmeri S. Watson) and red sprangletop (Leptochloa mucronata Michx) seeds were broadcast planted at three different densities (low, medium, and high) (Table 1) in 10 m × 10 m plots within a 0.6 ha field. Palmer amaranth is an annual plant native to the arid southwestern U.S. and northwestern Mexico and is one of the most problematic weeds in row crop production in the U.S. due to the evolution of multiple herbicide resistance in this species [43]. Red sprangletop is an annual grass weed widespread in cultivated lands of the southern U.S. Other weed species present in the experimental area, though at low frequencies, included morningglories (Ipomoea spp.), Texas millet (Urochloa texana Buckl.), and devil's claw (Proboscidea louisianica (Mill.) Thell.). Morningglory is a broadleaved, annual plant species with a fast growth rate. Texas millet is an annual grass weed native to the southern U.S. and is a troublesome weed in row crops. Devil's claw is an annual broadleaved weed commonly found across the sandy, arid areas of west and south Texas. The study had three replications, arranged in a randomized complete block design. The steps followed in image acquisition and weed mapping are summarized in the flowchart (Figure 2).

2.2. Data Collection

Multiple flights were conducted over the experimental area from May to July 2017; however, only the imagery acquired on 28 June 2017 was used for analysis in the current study because of its high image quality at an early to mid-crop growth stage. A multi-rotor UAS, the Phantom 4 (DJI, China), equipped with a 12 MP on-board camera was used to capture images in three bands (Red, Green, and Blue). Six ground control points (GCPs) were laid out throughout the study area for georeferencing the imagery, and global positioning system (GPS) coordinates of the GCPs were recorded using an EMLID GNSS receiver (EMLID Inc., Hong Kong, China). Three radiometric calibration panels (white, gray, and black) were placed on the ground to enable radiometric calibration of the imagery during the ortho-mosaicking process. Image data were collected at 15 m above ground level (AGL), with auto-exposure, 70% side and front overlap, and a forward UAV speed of 3 m/s. The flight was performed on a sunny day with a wind speed of approximately 11 km h−1. A total of 464 images (.JPG format) were captured during the flight mission. The .JPG format was chosen over the raw image format during image acquisition because the flight planner used in this study supported only the former. Moreover, Pix4D Mapper, the software used for stitching the images, did not support the raw image format.
Ground truthing data on weed species density were documented at the time of the flight operations. Five quadrats (1 m × 1 m) were laid out in each plot across the experimental area. For each quadrat, the number of individual plants of each species was counted and density (plants m−2) was determined for comparison with the image-based coverage area. Reflectance values for cotton and the weed species were recorded from the imagery to examine their spectral overlap (Figure 3) and choose appropriate techniques for further image processing.

2.3. Image Mosaicking and Radiometric Calibration

The images were mosaicked using the Pix4D Mapper software (Pix4D Inc., Lausanne, Switzerland) (Figure 4a). The GPS coordinates of the GCPs were post-corrected and used in the mosaicking process. Among the several templates available in Pix4D, the 'Ag RGB' template was chosen because it is recommended for mosaicking RGB imagery (Pix4D manual). The key point image scale was set to 'Full' mode and the minimum number of key point matches was set to '3' for point cloud densification in the template. The resulting ortho-mosaic imagery (Figure 4b) had spatial and radiometric resolutions of 8 mm/pixel and 8 bits per pixel, respectively.
The digital number (DN) values were calibrated to reflectance using the known reflectance values of the calibration panels and their corresponding DN values in the imagery. Three datasets were prepared, one per band, each containing 300 DN values from the panel pixels as the X-variable and the corresponding reflectance values as the Y-variable. Simple linear regression analyses were then conducted on these datasets to derive three separate regression models (Equations (1)–(3)) for predicting reflectance. The models were then applied to predict reflectance values for all pixels in the red, green, and blue bands.
$(\sigma_j)_r = \mu_1 (\lambda_j)_r + c_1$  (1)
$(\sigma_j)_g = \mu_2 (\lambda_j)_g + c_2$  (2)
$(\sigma_j)_b = \mu_3 (\lambda_j)_b + c_3$  (3)
where
  • $(\sigma_j)_r$ = predicted reflectance value of the jth pixel for the red band
  • $(\sigma_j)_g$ = predicted reflectance value of the jth pixel for the green band
  • $(\sigma_j)_b$ = predicted reflectance value of the jth pixel for the blue band
  • $(\lambda_j)_r$ = DN value of the jth pixel for the red band
  • $(\lambda_j)_g$ = DN value of the jth pixel for the green band
  • $(\lambda_j)_b$ = DN value of the jth pixel for the blue band
$\mu_1$, $\mu_2$, and $\mu_3$ are the slopes of the models for the red, green, and blue bands, respectively, whereas $c_1$, $c_2$, and $c_3$ are the corresponding constants.
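To make this calibration step concrete, the sketch below fits and applies a per-band linear model of the form above; it is an illustrative reconstruction only, and the panel sample arrays and band variable names are hypothetical rather than taken from the original workflow.

```python
import numpy as np

def fit_band_calibration(panel_dn, panel_reflectance):
    """Fit reflectance = mu * DN + c for one band from ~300 panel pixels."""
    mu, c = np.polyfit(np.asarray(panel_dn, dtype=float),
                       np.asarray(panel_reflectance, dtype=float), deg=1)
    return mu, c

def calibrate_band(band_dn, mu, c):
    """Apply the fitted model to every pixel of a band (Equations (1)-(3))."""
    return mu * band_dn.astype(float) + c

# Hypothetical usage for the red band:
# red_mu, red_c = fit_band_calibration(panel_dn_red, panel_reflectance_red)
# red_reflectance = calibrate_band(orthomosaic[:, :, 0], red_mu, red_c)
```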

2.4. Image Preprocessing

Image preprocessing is an important step that prepares the imagery for further analysis. The experimental plots in the imagery were clipped into individual subsets and subjected to further processing, which was completed in the following four steps:

2.4.1. Masking Non-Vegetative Area

To avoid potential misclassification of target objects with background objects, the non-vegetative areas were masked upfront. For this purpose, the excess green vegetation index (ExG) [44] was calculated using Equation (4).
$\mathrm{ExG} = \dfrac{2G - R - B}{G + R + B}$  (4)
where G, R, and B indicate green, red, and blue channel pixel values, respectively.
The Otsu thresholding method [45] was applied to identify an optimal threshold for developing a binary classification: vegetation vs. non-vegetation (Figure 4c).
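A minimal sketch of this masking step is shown below, assuming an 8-bit RGB plot subset; rescaling the ExG image to 8 bits so that OpenCV's Otsu thresholding can be applied is an implementation detail not spelled out in the text.

```python
import cv2
import numpy as np

def vegetation_mask(rgb):
    """Compute ExG (Equation (4)) and binarize it with Otsu's threshold.

    rgb: uint8 image in RGB channel order. Returns a uint8 mask where
    255 = vegetation and 0 = non-vegetation (soil/shadow).
    """
    r, g, b = [rgb[:, :, i].astype(np.float32) for i in range(3)]
    total = r + g + b + 1e-6                          # avoid division by zero
    exg = (2.0 * g - r - b) / total                   # normalized excess green
    exg_8bit = cv2.normalize(exg, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
    _, mask = cv2.threshold(exg_8bit, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    return mask
```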

2.4.2. Canny Edge Filtering

The Canny edge detection algorithm [46] was applied to the Otsu binary imagery to obtain the edges of the crop rows. The algorithm requires the user to specify two hyperparameters, "minVal" and "maxVal", which are the lower and upper limits of the intensity gradient: candidate edges whose gradient curve lies completely or partially above the upper limit are retained as true edges, whereas candidates below the lower limit are discarded. Several combinations of lower and upper values were tested by trial and error until the best visual results were obtained. A median filter was then applied to the edge imagery to remove edge noise and highlight the crop rows (Figure 4d).
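The fragment below illustrates how this edge detection and de-noising step could be scripted with OpenCV; the hysteresis thresholds and median kernel size are placeholders, since the study reports tuning them by trial and error.

```python
import cv2

def row_edges(binary_mask, min_val=50, max_val=150, kernel=5):
    """Detect crop-row edges on the Otsu mask and suppress edge noise.

    min_val/max_val are the Canny hysteresis thresholds (illustrative
    defaults only); kernel is the median filter aperture (odd integer).
    """
    edges = cv2.Canny(binary_mask, min_val, max_val)
    # Median filtering removes isolated edge pixels and highlights rows.
    return cv2.medianBlur(edges, kernel)
```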

2.4.3. Hough Line Transformation

To minimize potential misclassification, the classification algorithm was applied only after separating inter-row weeds from cotton by detecting crop rows using a widely used crop row detection method, the Hough transformation [47]. This method determines the positions of crop rows based on the parameters ρ and θ, where ρ is the perpendicular distance from the origin to the line and θ is the angle of the perpendicular projection from the origin to the line, measured clockwise from the positive X-axis of the image space [47]. The method was implemented over the de-noised imagery using the "HoughLines" function of the "OpenCV" library in the Python programming language to generate crop row lines (Figure 4e). The two hyperparameters for the function, ρ and θ, were set to 1000 and 0° to 180°, respectively.
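A minimal sketch of this step with OpenCV's standard Hough transform is given below; here the reported value of 1000 is interpreted as the accumulator (vote) threshold, and the distance and angle resolutions (1 pixel, 1°) are assumptions not stated in the text.

```python
import cv2
import numpy as np

def detect_row_lines(edge_img, accumulator_threshold=1000):
    """Detect straight crop-row lines on the de-noised edge image.

    Each returned line is (rho, theta): rho is the perpendicular distance
    from the image origin to the line and theta the angle of that
    perpendicular, searched over 0 to 180 degrees.
    """
    lines = cv2.HoughLines(edge_img, 1, np.pi / 180, accumulator_threshold)
    return [] if lines is None else [tuple(line[0]) for line in lines]
```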

2.4.4. Generation of Crop-Row Strips

The row strip width (α) around each Hough line in each plot was determined using Equation (5), based on the width of cotton measured for 20 random plants within a plot.
$\alpha = \dfrac{1}{20}\sum_{i=1}^{20} w_i$  (5)
where $\alpha$ denotes the mean width and $w_i$ represents the width of the ith cotton plant (i = 1 to 20), measured from a leaf tip on one side to a leaf tip on the other side in the direction perpendicular to the row axis.
The resulting width value was then used to generate crop-row strips for each row in a plot.

2.5. Weed Detection and Regression

Following the establishment of row strips, the OBIA framework was implemented for both intra- and inter-row weed detection using eCognition Developer software (Trimble Inc., Munich, Germany). Chessboard segmentation was then carried out over each plot image to produce grids of 5 × 5 pixels, each representing a 4 cm × 4 cm area on the ground. Any grids pertaining to vegetation outside of the Hough transformation-derived strips (Figure 4f) were classified as inter-row weeds. After assigning the inter-row weeds, the next step was to classify the grids within a strip into cotton and intra-row weeds, as grids pertaining to soil/shadows had already been masked during the image preprocessing steps. For this purpose, the Random Forest (RF) method [48], a non-parametric ensemble learning method, was used. This classifier builds a set of decision trees from randomly selected subsets of the training dataset and aggregates the votes of the individual trees to decide the final class of a test object. The outcome in each decision tree is determined based on information gain, gain ratio, and the Gini index [48] for each attribute or feature. The classifier requires two hyperparameters to be set by the user: 'ntree' (the number of decision trees built during the decision process) and 'mtry' (the number of features considered at each node of a decision tree). In this study, 'ntree' was set to 500 and 'mtry' to the square root of the total number of image features used in the classification.
A total of 18 grey level co-occurrence matrix (GLCM)-based textural features [49] and five spectral features were constructed for the clipped imageries (Table 2). A balanced sample of 600 grid objects per class (cotton and weed species), selected randomly from the chessboard segmentation output, was used to train the RF classifier. In addition, 200 samples per class (cotton, weeds, and soil/shadows) were used to validate the mapping in each density treatment. Prior to training, non-important features were discarded to reduce computation cost and time. The "varImp.randomForest" function in the "caret" package of the R programming language (R Foundation for Statistical Computing, Vienna, Austria) was used to compute an importance index for the features constructed in the study. The function uses an RF-based wrapper selection method that calculates the accuracy of each decision tree on its out-of-bag samples; the decrease in accuracy when the values of a given feature are permuted is averaged across all decision trees to obtain the mean decrease in accuracy, which is then rescaled to 1–100 and termed the importance index. In general, the higher the mean decrease in accuracy for a feature, the higher its importance index and the more informative the feature. In this study, features with an index value greater than 50 on the 1–100 scale were chosen, which included two spectral features (red band and ExG) and four textural features (GLCM_Homogeneity for the green band, GLCM_Contrast for the green and red bands, and GLCM_Entropy for the red band).
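For readers who prefer a scripted workflow over eCognition and caret, the sketch below shows an analogous Random Forest training and feature selection step in scikit-learn; the impurity-based importances used here only approximate caret's permutation-based mean decrease in accuracy, and the feature matrix and labels are placeholders.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def train_intra_row_classifier(X, y):
    """Train an RF classifier analogous to the one described in the text.

    X: array of grid objects x features (spectral + GLCM textural values);
    y: class labels ("cotton" or "weed"). ntree = 500 and mtry = sqrt(features),
    as reported in the study.
    """
    rf = RandomForestClassifier(n_estimators=500, max_features="sqrt",
                                oob_score=True, random_state=0)
    rf.fit(X, y)
    # Rescale importances to 1-100 and keep features scoring above 50,
    # mirroring the selection rule used in the study.
    imp = rf.feature_importances_
    denom = max(imp.max() - imp.min(), 1e-12)
    index = 1 + 99 * (imp - imp.min()) / denom
    selected = np.where(index > 50)[0]
    return rf, selected
```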
In certain cases, leaves of inter-row weeds overlapped with those of cotton within the strip and could not be classified using the standard approach. In such cases, an iterative feature ratio rule (Equation (6)) was used to re-label the mis-labeled vegetation as weeds.
$\varphi(x) = \begin{cases} 1, & 0.9 < \mathrm{ExG}_{ij} < 1.1 \\ 0, & \text{otherwise} \end{cases}$  (6)
where $\varphi(x)$ is the rule assigning a specific grid (j) to weed (1) or non-weed (0), and $\mathrm{ExG}_{ij}$ represents the ratio of the ExG value of the weed grid immediately outside the row strip (i) to that of the grid immediately inside the row strip (j).
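A literal reading of this rule can be expressed in a few lines, as sketched below; the grid ExG values are assumed to be precomputed.

```python
def relabel_overlapping_grid(exg_outside, exg_inside):
    """Apply Equation (6): flag an in-row grid as weed (1) when its ExG is
    within 10% of the adjacent inter-row weed grid's ExG, else 0."""
    ratio = exg_outside / exg_inside
    return 1 if 0.9 < ratio < 1.1 else 0
```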
Two model accuracy measures, namely overall accuracy (OA) and the kappa value (K), were calculated. The OA was calculated using Equation (7) as the number of correctly classified grid objects divided by the total number of validation samples.
$\mathrm{OA}\ (\%) = \dfrac{A + E + I}{A + B + C + D + E + F + G + H + I} \times 100$  (7)
where A, E, and I are the numbers of validation samples accurately classified as crop, weed, and soil/shadow, respectively; D and G are the numbers of crop samples inaccurately classified as weed and soil/shadow, respectively; B and H are the numbers of weed samples inaccurately classified as crop and soil/shadow, respectively; and C and F are the numbers of soil/shadow samples inaccurately classified as crop and weed, respectively.
The kappa value measures the agreement between observed and predicted classes beyond that expected by chance. The value ranges from 0 to 1, with 0 indicating no agreement beyond chance and 1 indicating full agreement between the observed and predicted values. The formula for computing kappa values was derived from Cohen [50].
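The sketch below computes both measures from a 3 × 3 confusion matrix of grid objects; it follows the standard OA and Cohen's kappa formulas rather than any code from the study.

```python
import numpy as np

def overall_accuracy_and_kappa(confusion):
    """Compute OA (Equation (7)) and Cohen's kappa from a confusion matrix.

    confusion: square array whose rows are predicted classes and columns
    are reference classes (crop, weed, soil/shadow).
    """
    confusion = np.asarray(confusion, dtype=float)
    total = confusion.sum()
    observed = np.trace(confusion) / total                  # proportion correct
    expected = (confusion.sum(axis=0) * confusion.sum(axis=1)).sum() / total ** 2
    kappa = (observed - expected) / (1.0 - expected)
    return 100.0 * observed, kappa
```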
Following the generation of the classification maps, the shapefiles for the weed classes were extracted from the maps and loaded into the ArcMap software (ESRI Inc., Redlands, CA, USA). The total area of all shapefiles belonging to a particular quadrat (1 m × 1 m) was calculated, divided by the area of the quadrat, and recorded as the image-derived weed coverage (%). A simple linear regression analysis was subsequently conducted in the R statistical software (R Core Team 2013) using the ground-based weed density (m−2) dataset as the Y-variable and the image-derived weed coverage (%) dataset as the X-variable. In addition to the model for the combined weed species in the quadrats, two separate models were developed for the two most dominant species in the experimental area, Palmer amaranth and red sprangletop. For this purpose, quadrats (9 for Palmer amaranth and 11 for red sprangletop) with >80% infestation of the respective species were selected across the experimental area.
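Although the study fit these models in R, the equivalent regression can be sketched in Python as below; the input arrays are placeholders for the per-quadrat coverage and density values.

```python
from scipy import stats

def coverage_density_model(coverage_pct, density_per_m2):
    """Fit density = slope * coverage + intercept and report r^2.

    coverage_pct: image-derived weed coverage (%) per quadrat (X-variable);
    density_per_m2: ground-truth weed density (plants per m^2) per quadrat (Y-variable).
    """
    result = stats.linregress(coverage_pct, density_per_m2)
    return result.slope, result.intercept, result.rvalue ** 2
```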
All data processing tasks were performed on a computer with relatively high processing power, comprising an Intel® Core™ i7-5960X 3.00 GHz central processing unit, 64 GB of random-access memory (RAM), and a 64-bit operating system. The data processing tasks, including image mosaicking, calibration, preprocessing, and weed detection and regression for all density treatment plots, took approximately 2 h and 30 min. This includes only the time required for running the Python scripts/software for the corresponding tasks and excludes the time associated with preparing the necessary data for tasks such as image calibration and training and validation of the classifier. The image mosaicking process consumed the majority of the time (approx. 70%), whereas the image preprocessing step required the least (approx. 5%).

3. Results and Discussion

3.1. Weed Mapping

The classification techniques used in the current study were effective in mapping the distribution of weeds in cotton with reasonably high accuracy. An overall accuracy (OA) of 89.16%, 85.83%, and 83.33% and kappa (K) values of 0.84, 0.79, and 0.75 were observed for the low-, medium-, and high-density plots, respectively (Figure 5). The generally high accuracy levels obtained in the current study can be attributed to the multi-step classification model, wherein potential misclassification was minimized by first detecting and separating inter-row weeds, thereby subjecting only the intra-row vegetation to machine learning-based classification. The classified maps and pixel density heat maps are shown in Figure 6.
Several studies have undertaken a similar multi-step classification approach and achieved high classification accuracies. For example, De Castro et al. [28] mapped both broadleaved and grass weed species in sunflower and cotton fields and achieved average weed detection accuracy indices of 73% and 75% for the cotton and sunflower fields, respectively. López-Granados et al. [7] mapped johnsongrass (Sorghum halepense (L.) Pers.) in maize (Zea mays L.) using a multi-step approach, wherein the maize rows were first delineated using an iterative strip formation process, the inter-row johnsongrass was detected, and then the normalized difference vegetation index (NDVI) and Excess Greenness Index (ExG) were used to distinguish intra-row johnsongrass from maize, with accuracies of 89% and 82% for multispectral and visual imagery, respectively. In another study, Gao et al. [36] combined a pixel-based method with OBIA to map weeds in a maize crop; Hough transformation was followed by a Random Forest (RF) classifier using spectral, GLCM-based textural, and geometrical features to classify intra-row weeds, with OA and K values of 94.5% and 0.91, respectively.
A unique aspect of the current study is that it implemented the multi-step approach under three different weed density levels and provided an outlook on how the accuracies are affected. Moreover, the majority of existing studies have focused on classifying weeds at an early crop stage when the weeds are sparsely distributed or there is a clear delineation of crop rows due to an absence of overlapping intra-row weeds. Such scenarios would minimize the complexity of generating crop row lines and further image processing tasks, leading to high classification accuracies. However, the current study successfully classified intra- and inter-row weeds even under high density levels.
The robustness of inter-row weed classification depended on how accurately the crop rows were delineated [7,36] using the crop row detection method. In the current study, this method was very effective, given the straight cotton rows in the field. However, additional processing may be required to remove redundant crop row lines in cases of non-linear rows or under very high weed densities, where the green pixels of inter-row weeds may overlap with crop pixels and make it difficult for the algorithm to identify row edges. Apart from the field structure, crop row detection can also be influenced by noise introduced during binarization, edge detection, and other preprocessing steps [51]. One of several reasons for inaccurate binarization could be the spectral similarity between shadow and underexposed vegetation pixels. The lower leaves are shaded by upper leaves, which can lead to spectral confusion between underexposed leaves and gaps in the crop canopy. Inaccurate masking of crop pixels, especially at the edges of the crop rows, might significantly affect crop row detection and thus lead to inaccurate crop row lines.
Low weed density plots had few inter-row weeds and very few intra-row weeds, whereas intra-row weeds were relatively frequent in the medium- and high-density plots. The lower OA in high-density plots, compared to low-density plots, can be attributed to the frequent occurrence of intra-row weeds in high-density plots and the difficulty of distinguishing intra-row weeds from cotton using the OBIA method. A high incidence of intra-row weeds increased the risk of spectral similarity and obscured the textural uniqueness of the plants due to canopy interlocking. This situation was found in every replication of the high-density plots. The standard deviations of OA and kappa for the different density plots were low and quite similar, probably because of the high similarity in the spatial configuration and amount of inter-row and intra-row weeds among replications. In addition, segmented objects (groups of pixels), rather than individual pixels, were used as the validation samples, which may have lowered the variability in the accuracy measures.
Morningglories showed high spectral (all three bands) and textural confusion with cotton, compared to Palmer amaranth and red sprangletop. The morningglories were often seen creeping into the cotton rows on the ground, adding complexity to the spectral distinction. Red sprangletop, in particular, had very low spectral overlap with cotton compared to the other weed species; it was the only major grass species found in the area and was visually distinct on the ground. Though Palmer amaranth had high spectral similarity with cotton, the GLCM-based textural attributes based on 5 × 5 pixel kernels differed between the two species, owing to differences in leaf size and canopy structure. Thus, a combination of spectral and textural features yielded high classification accuracies [52,53]. However, it should also be noted that the spectral similarity observed here between cotton and weeds may not hold in other situations. Spectral confusion depends primarily on the spectral, spatial, and radiometric resolution of the imagery; the growth stages of the crop and weeds; and the crop production system. For example, including non-visible bands in the analysis may help increase the spectral separability between crops and weeds. Further, an increase in radiometric resolution adds more gradations of pixel values and thus provides more detail at the pixel level.

3.2. Relationship between Weed Density and Pixel Coverage

The relationship between the image-based weed coverage data (i.e., area covered by weed pixels) and the ground-based weed density assessments (plants m−2) was determined using simple linear regression for total weed species and, individually, for Palmer amaranth and red sprangletop. A fairly high coefficient of determination (r2 = 0.80) was achieved for the total weed species present in a quadrat, indicating that weed density in the crop field can be estimated from weed pixel coverage (Figure 7). The ability to assess weed densities using aerial images has been demonstrated previously. For example, Gao et al. [36] obtained a high coefficient of determination (r2 = 0.89) between image-based and manually assessed weed density in a maize field using very high-resolution imagery (1.78 mm/pixel). Although the coefficient of determination in the current study (r2 = 0.80) was slightly lower than that of Gao et al. [36], the current study provided considerably high accuracy even with a coarser spatial resolution (8 mm/pixel).
The degree of relationship between ground-based manual assessments and aerial image-based assessments can be affected by several factors, including the weed and crop species being studied, the growth stages of the weeds, and environmental factors. First, the accuracy of weed pixel coverage determination in aerial imagery depends on how well the weeds are classified and distinguished from the crop. Second, interlocked growth of weed plants can lead to inaccurate estimation of pixel coverage, since the coverage area of two interlocking weed plants may be less than the actual value. It is also possible that dominant weeds partially or completely mask other species growing underneath them, affecting the total pixel counts [53,54]. Among the individual weed species assessed in the present study, high coefficients of determination were achieved for red sprangletop (r2 = 0.88) (Figure 8a) and Palmer amaranth (r2 = 0.91) (Figure 8b). The higher accuracy for Palmer amaranth compared to red sprangletop can be largely attributed to differences in growth pattern between the two species; red sprangletop plants overlapped with each other more and had more variable plant sizes than Palmer amaranth.
In this study, we aimed to demonstrate whether and how UAS can be used to map different densities of early- to mid-season weeds in cotton and estimate their densities. We do not anticipate our regression models to be as accurate under alternative experimental settings (e.g., different flight heights, weed growth stages, forward speeds, lighting conditions, etc.). Rather, the study demonstrates that true color UAV imagery can indeed be used to map early- to mid-season weeds. Nonetheless, we are confident that our methodology can be adopted and expanded by other studies with a similar focus. The prime reason for this adaptability is that the supervised classification method implemented in this study is based on local training data. The regression model for weed estimation depends upon the classification-based weed canopy coverage area, which in turn depends upon the training samples collected by the researcher; this process is the same for any weed size in cotton. The following points highlight the significance of the experimental plan and the outputs of this study:
(a) The study has demonstrated whether and how early- to mid-season weeds can be mapped in cotton using true color UAS-borne imagery.
(b) The study has shown that vegetation indices such as the excess greenness index and textural features can be used in mapping early- to mid-season weeds, at least for high spatial resolution true color imagery. This information can guide future researchers with similar interests.
(c) The study has illustrated that high spatial resolution true color imagery-based weed coverage area can be an effective determinant of weed density in cotton at early to mid weed growth stages.
(d) The study has also demonstrated how high spatial resolution imagery can be utilized to detect early- to mid-season cotton rows and to use this information to easily segment out inter-row weeds.

4. Conclusions

This study demonstrated a methodology for mapping weed infestations in cotton utilizing RGB imagery and non-conventional image analysis techniques. Advanced computer vision techniques were tested to map weeds under different density levels and to determine the relationship between image-based weed coverage estimates and ground-based weed density assessments. The current study successfully demonstrated that these techniques can be applied across different levels of weed density. The spatial maps and density prediction models can be valuable resources for farmers and consultants for robust assessment of weed infestations and for making informed management decisions. Furthermore, the successful application of RGB imagery for this purpose emphasizes its usefulness for weed assessment.
This study, however, has a few limitations: (a) the results presented here were based on an experiment carried out at a specific weed growth stage (Table 1), and thus the predictive model for weed densities may not be applicable to other weed growth stages. However, our study proved the effectiveness of the computer vision techniques in weed density assessments, and this approach can be expanded to other scenarios as well. (b) The quadrats used for the regression analysis of individual weed species were selected such that the specific weed densities were >80% within each quadrat. This was necessary because of the difficulty of distinguishing different weed species using RGB imagery at this image resolution (8 mm/pixel). Such high densities of a single weed species may not be typical of all field scenarios, and the occurrence of a mix of multiple weed species can reduce prediction accuracy. However, ongoing technological improvements may improve weed classification and provide a solution to this challenge. Future research should focus on utilizing multispectral and hyperspectral imagery and on developing improved classification algorithms for weed infestation assessments.

Author Contributions

M.B. conceptualized the research; V.S. and M.B. designed the field experiments; V.S. conducted the field study; D.C. and J.V. assisted with aerial data collection and interpretation; B.S. conducted data analysis; B.S. and V.S. wrote the manuscript; all authors edited and revised the manuscript. All authors have read and agreed to the published version of the manuscript.

Funding

Funding for this research was provided by Texas A&M AgriLife Research—Cropping Systems Seed Grant Program (#111222), and Cotton Incorporated (#19-255).

Acknowledgments

The authors would like to acknowledge Cody Bagnall and Ian Gates for conducting the UAS flights, Misty Miles for coordinating flight operations, and Seth Abugho for assistance with weed species establishment. Funding for this research was provided by the Texas A&M AgriLife Research Cropping Systems Seed Grant Program.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Hall, M.R.; Swanton, C.J.; Anderson, G.W. The Critical Period of Weed Control in Grain Corn (Zea mays). Weed Sci. 1992, 40, 441–447. [Google Scholar] [CrossRef]
  2. Oerke, E.-C. Crop losses to pests. J. Agric. Sci. 2005, 144, 31–43. [Google Scholar] [CrossRef]
  3. Aracena, P.A. Spatially-Explicit Decision Support System for Invasive Weed Species Management. Ph.D. Thesis, University of Montana, Missoula, MT, USA, 2013. [Google Scholar]
  4. Van Evert, F.K.; Fountas, S.; Jakovetic, D.; Crnojevic, V.; Travlos, I.; Kempenaar, C. Big Data for weed control and crop protection. Weed Res. 2017, 57, 218–233. [Google Scholar] [CrossRef] [Green Version]
  5. Lamb, D.; Brown, R. PA—Precision Agriculture. J. Agric. Eng. Res. 2001, 78, 117–125. [Google Scholar] [CrossRef]
  6. Everitt, J.H.; Fletcher, R.S.; Elder, H.S.; Yang, C. Mapping giant salvinia with satellite imagery and image analysis. Environ. Monit. Assess. 2007, 139, 35–40. [Google Scholar] [CrossRef] [PubMed]
  7. Lopez-Granados, F.; Torres-Sánchez, J.; Serrano-Pérez, A.; De Castro, A.I.; Mesas-Carrascosa, F.-J.; Peña-Barragan, J.M.; Mesas-Carrascosa, F.-J. Early season weed mapping in sunflower using UAV technology: Variability of herbicide treatment maps against weed thresholds. Precis. Agric. 2015, 17, 183–199. [Google Scholar] [CrossRef]
  8. Castaldi, F.; Pelosi, F.; Pascucci, S.; Casa, R. Assessing the potential of images from unmanned aerial vehicles (UAV) to support herbicide patch spraying in maize. Precis. Agric. 2016, 18, 76–94. [Google Scholar] [CrossRef]
  9. Goudy, H.J.; Bennett, K.A.; Brown, R.B.; Tardif, F.J. Evaluation of site-specific weed management using a direct-injection sprayer. Weed Sci. 2001, 49, 359–366. [Google Scholar] [CrossRef]
  10. Borra-Serrano, I.; Peña-Barragan, J.M.; Torres-Sánchez, J.; Mesas-Carrascosa, F.-J.; Lopez-Granados, F. Spatial Quality Evaluation of Resampled Unmanned Aerial Vehicle-Imagery for Weed Mapping. Sensors 2015, 15, 19688–19708. [Google Scholar] [CrossRef] [PubMed]
  11. De Castro, A.I.; Lopez-Granados, F.; Jurado-Expósito, M. Broad-scale cruciferous weed patch classification in winter wheat using QuickBird imagery for in-season site-specific control. Precis. Agric. 2013, 14, 392–413. [Google Scholar] [CrossRef] [Green Version]
  12. Castillejo-González, I.L.; Peña-Barragan, J.M.; Jurado-Expósito, M.; Mesas-Carrascosa, F.-J.; Lopez-Granados, F. Evaluation of pixel- and object-based approaches for mapping wild oat (Avena sterilis) weed patches in wheat fields using QuickBird imagery for site-specific management. Eur. J. Agron. 2014, 59, 57–66. [Google Scholar] [CrossRef]
  13. Shi, Y.; Thomasson, J.A.; Murray, S.C.; Pugh, N.A.; Rooney, W.L.; Shafian, S.; Rajan, N.; Rouze, G.; Morgan, C.L.S.; Neely, H.L.; et al. Unmanned Aerial Vehicles for High-Throughput Phenotyping and Agronomic Research. PLoS ONE 2016, 11, e0159781. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  14. Singh, V.; Rana, A.; Bishop, M.; Filippi, A.M.; Cope, D.; Rajan, N.; Bagavathiannan, M. Unmanned aircraft systems for precision weed detection and management: Prospects and challenges. In Advances in Agronomy; Elsevier BV: Amsterdam, The Netherlands, 2020; Volume 159, pp. 93–134. [Google Scholar]
  15. Akar, Ö. Mapping land use with using Rotation Forest algorithm from UAV images. Eur. J. Remote Sens. 2017, 50, 269–279. [Google Scholar] [CrossRef] [Green Version]
  16. Hoffmann, H.; Jensen, R.; Thomsen, A.; Nieto, H.; Rasmussen, J.; Friborg, T. Crop water stress maps for an entire growing season from visible and thermal UAV imagery. Biogeosciences 2016, 13, 6545–6563. [Google Scholar] [CrossRef] [Green Version]
  17. Ludovisi, R.; Tauro, F.; Salvati, R.; Khoury, S.; Scarascia, G.M.; Harfouche, A.L. UAV-Based Thermal Imaging for High-Throughput Field Phenotyping of Black Poplar Response to Drought. Front. Plant Sci. 2017, 8, 8. [Google Scholar] [CrossRef]
  18. Mirik, M.; Jones, D.C.; Price, J.; Workneh, F.; Ansley, R.J.; Rush, C.M. Satellite Remote Sensing of Wheat Infected by Wheat streak mosaic virus. Plant Dis. 2011, 95, 4–12. [Google Scholar] [CrossRef] [Green Version]
  19. Sugiura, R.; Tsuda, S.; Tamiya, S.; Itoh, A.; Nishiwaki, K.; Murakami, N.; Shibuya, Y.; Hirafuji, M.; Nuske, S. Field phenotyping system for the assessment of potato late blight resistance using RGB imagery from an unmanned aerial vehicle. Biosyst. Eng. 2016, 148, 1–10. [Google Scholar] [CrossRef]
  20. Swain, K.C.; Thomson, S.J.; Jayasuriya, H.P.W. Adoption of an Unmanned Helicopter for Low-Altitude Remote Sensing to Estimate Yield and Total Biomass of a Rice Crop. Trans. ASABE 2010, 53, 21–27. [Google Scholar] [CrossRef] [Green Version]
  21. Torres-Sánchez, J.; De Castro, A.I.; Peña-Barragan, J.M.; Jiménez-Brenes, F.M.; Arquero, O.; Lovera, M.; Lopez-Granados, F. Mapping the 3D structure of almond trees using UAV acquired photogrammetric point clouds and object-based image analysis. Biosyst. Eng. 2018, 176, 172–184. [Google Scholar] [CrossRef]
  22. Comba, L.; Biglia, A.; Aimonino, D.R.; Tortia, C.; Mania, E.; Guidoni, S.; Gay, P. Leaf Area Index evaluation in vineyards using 3D point clouds from UAV imagery. Precis. Agric. 2019, 1–16. [Google Scholar] [CrossRef] [Green Version]
  23. Mesas-Carrascosa, F.-J.; De Castro, A.I.; Torres-Sánchez, J.; Tarradas, P.T.; Jiménez-Brenes, F.M.; García-Ferrer, A.; Lopez-Granados, F. Classification of 3D Point Clouds Using Color Vegetation Indices for Precision Viticulture and Digitizing Applications. Remote Sens. 2020, 12, 317. [Google Scholar] [CrossRef] [Green Version]
  24. Zermas, D.; Morellas, V.; Mulla, D.J.; Papanikolopoulos, N. 3D model processing for high throughput phenotype extraction–the case of corn. Comput. Electron. Agric. 2020, 172, 105047. [Google Scholar] [CrossRef]
  25. Louargant, M.; Jones, G.; Faroux, R.; Paoli, J.-N.; Maillot, T.; Gee, C.; Villette, S. Unsupervised Classification Algorithm for Early Weed Detection in Row-Crops by Combining Spatial and Spectral Information. Remote Sens. 2018, 10, 761. [Google Scholar] [CrossRef] [Green Version]
  26. Peña-Barragan, J.M.; Torres-Sánchez, J.; De Castro, A.I.; Kelly, M.; Lopez-Granados, F. Weed Mapping in Early-Season Maize Fields Using Object-Based Analysis of Unmanned Aerial Vehicle (UAV) Images. PLoS ONE 2013, 8, e77151. [Google Scholar] [CrossRef] [Green Version]
  27. Rasmussen, J.; Nielsen, J.; Garcia-Ruiz, F.; Christensen, S.; Streibig, J.C. Potential uses of small unmanned aircraft systems (UAS) in weed research. Weed Res. 2013, 53, 242–248. [Google Scholar] [CrossRef]
  28. De Castro, A.I.; Torres-Sánchez, J.; Peña-Barragan, J.M.; Jiménez-Brenes, F.M.; Csillik, O.; Lopez-Granados, F. An Automatic Random Forest-OBIA Algorithm for Early Weed Mapping between and within Crop Rows Using UAV Imagery. Remote Sens. 2018, 10, 285. [Google Scholar] [CrossRef] [Green Version]
  29. Mink, R.; Dutta, A.; Peteinatos, G.; Sökefeld, M.; Engels, J.J.; Hahn, M.; Gerhards, R. Multi-Temporal Site-Specific Weed Control of Cirsium arvense (L.) Scop. and Rumex crispus L. in Maize and Sugar Beet Using Unmanned Aerial Vehicle Based Mapping. Agriculture 2018, 8, 65. [Google Scholar] [CrossRef] [Green Version]
  30. Sa, I.; Popović, M.; Khanna, R.; Chen, Z.; Lottes, P.; Liebisch, F.; Nieto, J.; Stachniss, C.; Walter, A.; Siegwart, R. WeedMap: A Large-Scale Semantic Weed Mapping Framework Using Aerial Multispectral Imaging and Deep Neural Network for Precision Farming. Remote Sens. 2018, 10, 1423. [Google Scholar] [CrossRef] [Green Version]
  31. Mirik, M.; Ansley, R.J.; Steddom, K.; Jones, D.C.; Rush, C.M.; Michels, G.J.; Elliott, N.C. Remote Distinction of A Noxious Weed (Musk Thistle: CarduusNutans) Using Airborne Hyperspectral Imagery and the Support Vector Machine Classifier. Remote Sens. 2013, 5, 612–630. [Google Scholar] [CrossRef] [Green Version]
  32. Atkinson, J.T.; Ismail, R.; Robertson, M. Mapping Bugweed (Solanum mauritianum) Infestations in Pinus patula Plantations Using Hyperspectral Imagery and Support Vector Machines. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2013, 7, 17–28. [Google Scholar] [CrossRef]
  33. Gao, J.; Nuyttens, D.; Lootens, P.; He, Y.; Pieters, J.G. Recognising weeds in a maize crop using a random forest machine-learning algorithm and near-infrared snapshot mosaic hyperspectral imagery. Biosyst. Eng. 2018, 170, 39–50. [Google Scholar] [CrossRef]
  34. Calderón, R.; Navas-Cortés, J.A.; Zarco-Tejada, P.J. Early Detection and Quantification of Verticillium Wilt in Olive Using Hyperspectral and Thermal Imagery over Large Areas. Remote Sens. 2015, 7, 5584–5610. [Google Scholar] [CrossRef] [Green Version]
  35. Sankaran, S.; Maja, J.M.; Buchanon, S.; Ehsani, R. Huanglongbing (Citrus Greening) Detection Using Visible, Near Infrared and Thermal Imaging Techniques. Sensors 2013, 13, 2117–2130. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  36. Gao, J.; Liao, W.; Nuyttens, D.; Lootens, P.; Vangeyte, J.; Panić, M.; He, Y.; Pieters, J.G. Fusion of pixel and object-based features for weed mapping using unmanned aerial vehicle imagery. Int. J. Appl. Earth Obs. Geoinf. 2018, 67, 43–53. [Google Scholar] [CrossRef]
  37. Huang, H.; Deng, J.; Lan, Y.; Yang, A.; Deng, X.; Zhang, L. A fully convolutional network for weed mapping of unmanned aerial vehicle (UAV) imagery. PLoS ONE 2018, 13, e0196302. [Google Scholar] [CrossRef] [Green Version]
  38. Lottes, P.; Khanna, R.; Pfeifer, J.; Siegwart, R.; Stachniss, C. UAV-based crop and weed classification for smart farming. In Proceedings of the 2017 IEEE International Conference on Robotics and Automation (ICRA); Institute of Electrical and Electronics Engineers (IEEE), Marina Bay, Singapore, 29 May–3 June 2017; pp. 3024–3031. [Google Scholar]
  39. Blaschke, T. Object based image analysis for remote sensing. ISPRS J. Photogramm. Remote Sens. 2010, 65, 2–16. [Google Scholar] [CrossRef] [Green Version]
  40. Bah, M.D.; Dericquebourg, E.; Hafiane, A.; Canals, R. Deep Learning Based Classification System for Identifying Weeds Using High-Resolution UAV Imagery. In Proceedings of the Advances in Intelligent Systems and Computing; Springer Science and Business Media LLC: Berlin, Germany, 2018; pp. 176–187. [Google Scholar]
  41. Sapkota, B.B.; Liang, L. A multistep approach to classify full canopy and leafless trees in bottomland hardwoods using very high-resolution imagery. J. Sustain. For. 2017, 37, 339–356. [Google Scholar] [CrossRef]
  42. Sapkota, B.B.; Liang, L. High-Resolution Mapping of Ash (Fraxinus spp.) in Bottomland Hardwoods to Slow Emerald Ash Borer Infestation. Sci. Remote Sens. 2020, 1, 100004. [Google Scholar] [CrossRef]
  43. Ward, S.M.; Webster, T.; Steckel, L.E. Palmer Amaranth (Amaranthus palmeri): A Review. Weed Technol. 2013, 27, 12–27. [Google Scholar] [CrossRef] [Green Version]
  44. Woebbecke, D.M.; Meyer, G.E.; Von Bargen, K.; Mortensen, D.A. Color Indices for Weed Identification Under Various Soil, Residue, and Lighting Conditions. Trans. ASAE 1995, 38, 259–269. [Google Scholar] [CrossRef]
  45. Otsu, N. A Threshold Selection Method from Gray-Level Histograms. IEEE Trans. Syst. Man, Cybern. 1979, 9, 62–66. [Google Scholar] [CrossRef] [Green Version]
  46. Canny, J. A Computational Approach to Edge Detection. IEEE Trans. Pattern Anal. Mach. Intell. 1986, 679–698. [Google Scholar] [CrossRef]
  47. Slaughter, D.C.; Giles, D.; Downey, D. Autonomous robotic weed control systems: A review. Comput. Electron. Agric. 2008, 61, 63–78. [Google Scholar] [CrossRef]
  48. Breiman, L. Bagging predictors. Mach. Learn. 1996, 24, 123–140. [Google Scholar] [CrossRef] [Green Version]
  49. Haralick, R.M.; Shanmugam, K.; Dinstein, I. Textural Features for Image Classification. IEEE Trans. Syst. Man Cybern. 1973, 3, 610–621. [Google Scholar] [CrossRef] [Green Version]
  50. Cohen, J. A Coefficient of Agreement for Nominal Scales. Educ. Psychol. Meas. 1960, 20, 37–46. [Google Scholar] [CrossRef]
  51. Gée, C.; Bossu, J.; Jones, G.; Truchetet, F. Crop/weed discrimination in perspective agronomic images. Comput. Electron. Agric. 2008, 60, 49–59. [Google Scholar] [CrossRef]
  52. Wu, L.; Wen, Y. Weed/corn seedling recognition by support vector machine using texture features. Afr. J. Agric. Res. 2009, 4, 840–846. [Google Scholar]
  53. Lin, F.; Zhang, D.; Huang, Y.; Wang, X.; Chen, X. Detection of Corn and Weed Species by the Combination of Spectral, Shape and Textural Features. Sustainability 2017, 9, 1335. [Google Scholar] [CrossRef] [Green Version]
  54. Connolly, J.; Sebastià, M.-T.; Kirwan, L.; Finn, J.; Llurba, R.; Suter, M.; Collins, R.P.; Porqueddu, C.; Helgadóttir, Á.; Baadshaug, O.H.; et al. Weed suppression greatly increased by plant diversity in intensively managed grasslands: A continental-scale experiment. J. Appl. Ecol. 2017, 55, 852–862. [Google Scholar] [CrossRef]
Figure 1. The experimental field (0.6 ha) with the spatial distribution of treatment plots representing low (green polygons), medium (blue), and high (red) weed densities. Yellow stars within the density plots represent the locations of the experimental units, i.e., quadrats (1 m × 1 m). Each treatment plot has five experimental units.
Figure 2. Flowchart for the overall methodology followed in this research for mapping weeds in a cotton field. The specific steps included (shown in dashed boxes) are: (a) data collection, (b) image mosaicking, (c) image preprocessing, and (d) weed detection and regression.
Figure 3. Reflectance values for the three most dominant weed species in the experimental area, compared with cotton for red, green, and blue bands in the visual imagery.
Figure 4. Various stages of image pre-processing: (a) Loading raw images in Pix4D software for mosaicking, (b) Clipping RGB imagery pertaining to each 10 m × 10 m treatment plot, (c) Otsu-thresholding, (d) Applying canny-edge algorithm and median filtering, (e) Generating Hough lines over the RGB imagery, and (f) Creating strips around Hough lines (green lines); here red pixels represent inter-row weeds and black pixels represent soil/shadows.
Figure 5. Accuracy measures for various levels (low, medium, and high) of weed densities established in the experiment.
Figure 6. Results showing weed coverage in each replication (Rep 1, Rep 2, and Rep 3 on the upper-left, upper-right, and bottom-left panels, respectively) for three different density treatment plots (low, medium, and high). The pixels pertaining to weeds and crop in the classified maps were analysed using a multi-step approach involving separation of inter-row weeds first using Hough transformation and then detection of intra-row weeds using random forest classifier. The weed pixel density heat maps were derived by first converting the classified pixels to point shape files and performing point kernel density analysis on the shapefiles.
Figure 7. Linear regression showing the strength of association between weed pixel coverage (%) quantified using aerial imagery and overall weed density (no. of weeds m−2) in the quadrats determined by ground truthing.
Figure 8. Regression analysis of weed pixel coverage (%) obtained using aerial imagery and ground-based weed density (m−2) for (a) red sprangletop and (b) Palmer amaranth.
Table 1. Weed infestation levels in the low, medium, and high density plots in the experimental area.

Species | Low a | Medium a | High a | Size *
Palmer amaranth (Amaranthus palmeri S. Watson) | 2 | 5 | 10 | 4 to 6 leaf stage (10–15 cm tall)
Red sprangletop (Leptochloa mucronata Michx.) | 2 | 8 | 15 | 4 to 10 tiller stage (8–15 cm tall)
Morningglories (Ipomoea spp.) | 1 | 2 | 3 | 1 to 4 leaf stage
Texas millet (Urochloa texana Buckl.) | 0 | 1 | 3 | 2 to 7 tiller stage (7–10 cm tall)
Devil’s claw (Proboscidea louisianica (Mill.) Thell.) | 1 | 2 | 2 | 1 to 4 leaf stage
Total | 6 | 18 | 33 |
a Average weed density (m−2). * Size at the time of capturing the aerial image used in this research.
Table 2. Full set of image features tested in the study.

Feature Type | Feature Name a | Description
Spectral (N = 4) | B, G, R, ExG | Mean values of all three channels and derived features for each grid object
Textural (N = 18) | GLCM Homogeneity at 45° for R, G, and B; GLCM Homogeneity at 270° for R, G, and B; GLCM Contrast at 45° for R, G, and B; GLCM Contrast at 270° for R, G, and B; GLCM Entropy at 45° for R, G, and B; GLCM Entropy at 270° for R, G, and B | Second-order textural statistics based on Haralick et al. [49]
a R, G, B, ExG, and GLCM denote mean values for Red, Green, Blue bands, Excess Greenness Index, and Grey-Level Co-occurrence Matrix, respectively.
