Article

Mapping and Estimating Weeds in Cotton Using Unmanned Aerial Systems-Borne Imagery

1 Department of Soil and Crop Sciences, Texas A&M University, College Station, TX 77843, USA
2 Department of Mechanical Engineering, Texas A&M University, College Station, TX 77843, USA
3 Department of Aerospace Engineering, Texas A&M University, College Station, TX 77843, USA
* Author to whom correspondence should be addressed.
Received: 10 May 2020 / Revised: 3 June 2020 / Accepted: 7 June 2020 / Published: 16 June 2020
(This article belongs to the Special Issue Feature Papers in Cotton Automation, Machine Vision and Robotics)
In recent years, Unmanned Aerial Systems (UAS) have emerged as an innovative technology for providing spatio-temporal information about weed species in crop fields. Such information is a critical input for any site-specific weed management program. A multi-rotor UAS (Phantom 4) equipped with an RGB sensor was used to collect imagery in three bands (Red, Green, and Blue; 0.8 cm/pixel resolution) with the objectives of (a) mapping weeds in cotton and (b) determining the relationship between image-based weed coverage and ground-based weed densities. For weed mapping, three different weed density levels (high, medium, and low) were established for a mix of different weed species, with three replications. To determine weed densities through ground truthing, five quadrats (1 m × 1 m) were laid out in each plot. The aerial images were preprocessed and subjected to Hough transformation to delineate cotton rows. Following the separation of inter-row vegetation from crop rows, a multi-level classification coupled with machine learning algorithms was used to distinguish intra-row weeds from cotton. Overall accuracy levels of 89.16%, 85.83%, and 83.33% and kappa values of 0.84, 0.79, and 0.75 were achieved for detecting weed occurrence in high-, medium-, and low-density plots, respectively. Further, ground-truthing-based overall weed density values were fairly well correlated (r² = 0.80) with image-based weed coverage assessments. Among the specific weed species evaluated, Palmer amaranth (Amaranthus palmeri S. Watson) showed the highest correlation (r² = 0.91), followed by red sprangletop (Leptochloa mucronata Michx) (r² = 0.88). The results highlight the utility of UAS-borne RGB imagery for weed mapping and density estimation in cotton for precision weed management.
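The row delineation described above relies on the Hough transform, which votes each vegetation pixel into (ρ, θ) line-parameter space and takes the accumulator peak as a crop-row line. A minimal pure-Python sketch of that voting step, run on synthetic points rather than real imagery (the study would have operated on classified UAS pixels; the function name and data here are illustrative only):

```python
import math

def hough_peak(points, thetas, rho_res=1.0):
    # Accumulate votes for (rho, theta) line parameters over edge points.
    # A pixel (x, y) lies on the line rho = x*cos(theta) + y*sin(theta).
    acc = {}
    for (x, y) in points:
        for t in thetas:
            rho = round((x * math.cos(t) + y * math.sin(t)) / rho_res)
            acc[(rho, t)] = acc.get((rho, t), 0) + 1
    # The strongest bin corresponds to the dominant line (e.g., a crop row).
    return max(acc.items(), key=lambda kv: kv[1])

# Synthetic "crop row": 20 vegetation pixels along the vertical line x = 5.
points = [(5, y) for y in range(20)]
thetas = [math.radians(d) for d in range(0, 180, 5)]
(rho, theta), votes = hough_peak(points, thetas)
# Peak lands at rho = 5, theta = 0 (a vertical line) with all 20 votes.
```

Real pipelines would use an optimized accumulator over a binarized vegetation mask, but the voting logic is the same.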
Keywords: digital agronomy; Hough transformation; machine learning; object-based image analysis; precision agriculture
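The overall accuracy and kappa values reported in the abstract are standard confusion-matrix summaries of classifier agreement with ground truth. A minimal sketch of Cohen's kappa, using a hypothetical 2×2 weed/cotton matrix (the counts below are illustrative, not from the study):

```python
def cohen_kappa(confusion):
    # confusion: square matrix of counts; rows = ground truth, cols = predicted.
    n = sum(sum(row) for row in confusion)
    k = len(confusion)
    po = sum(confusion[i][i] for i in range(k)) / n  # observed agreement
    pe = sum(  # chance agreement expected from the row/column marginals
        (sum(confusion[i]) / n) * (sum(row[i] for row in confusion) / n)
        for i in range(k)
    )
    return (po - pe) / (1 - pe)

# Hypothetical weed-vs-cotton confusion matrix: 90% accuracy, kappa = 0.8.
cm = [[45, 5],
      [5, 45]]
kappa = cohen_kappa(cm)
```

Kappa discounts agreement expected by chance, which is why it is reported alongside raw accuracy for imbalanced weed-density plots.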
MDPI and ACS Style

Sapkota, B.; Singh, V.; Cope, D.; Valasek, J.; Bagavathiannan, M. Mapping and Estimating Weeds in Cotton Using Unmanned Aerial Systems-Borne Imagery. AgriEngineering 2020, 2, 350-366. https://0-doi-org.brum.beds.ac.uk/10.3390/agriengineering2020024

