Feature Papers in Cotton Automation, Machine Vision and Robotics

A special issue of AgriEngineering (ISSN 2624-7402).

Deadline for manuscript submissions: closed (31 October 2021) | Viewed by 29120

Special Issue Editors


Guest Editor
Dr. Mathew G. Pelletier
Cotton Production and Processing Research Unit, United States Department of Agriculture, Agricultural Research Service, Lubbock, TX 79403, USA
Interests: instrumentation and sensor development; robotics; microwave sensing; machine vision; artificial intelligence

Guest Editor
Dr. Edward M. Barnes
Senior Director of Agricultural & Environmental Research, Cotton Incorporated, Cary, NC 27513, USA
Interests: remote sensing; cotton engineering (ginning); precision farming; irrigation and conservation; tillage systems research

Special Issue Information

Dear Colleagues,

With recent leaps in artificial intelligence, we are on the verge of a new explosion of autonomous systems and machines. This Special Issue aims to bring together recent developments in robotics and automation with respect to their potential or proven capabilities in cotton production and post-harvest processing applications. Contributions are expected to address, but are not limited to, the following areas:

  • Robotics
  • Machine vision, computer vision
  • Automation
  • Artificial intelligence
  • Deep learning, neural networks, convolutional networks
  • Radar
  • Lidar
  • Sensors
  • Instrumentation
Dr. Mathew G. Pelletier
Dr. Edward M. Barnes
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. AgriEngineering is an international peer-reviewed open access quarterly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • cotton
  • robotics
  • automation
  • radar
  • lidar
  • machine learning
  • machine vision
  • artificial intelligence
  • deep neural networks
  • convolutional networks

Published Papers (6 papers)


Research


19 pages, 9732 KiB  
Article
CHAP: Cotton-Harvesting Autonomous Platform
by Joe Mari Maja, Matthew Polak, Marlowe Edgar Burce and Edward Barnes
AgriEngineering 2021, 3(2), 199-217; https://doi.org/10.3390/agriengineering3020013 - 09 Apr 2021
Cited by 12 | Viewed by 4156
Abstract
The US cotton industry provided over 190,000 jobs and more than $28 billion in total economic contributions to the United States in 2012. The US is the third-largest cotton-producing country in the world, following India and China. US cotton producers have been able to stay competitive with countries like India and China by adopting the latest technologies. Despite the success of technology adoption, there are still many challenges, e.g., increased pest resistance, mainly glyphosate-resistant weeds, and early indications of bollworm resistance to Bt cotton (genetically modified cotton that contains genes for an insecticide). Commercial small unmanned ground vehicles (UGVs), or mobile ground robots with navigation-sensing modalities, provide a platform to increase farm management efficiency. The platform can be retrofitted with different implements that perform a specific task, e.g., spraying, scouting (with multiple sensors), phenotyping, harvesting, etc. This paper presents a proof-of-concept cotton harvesting robot. The robot was retrofitted with a vacuum-type system with a small storage bin. A single harvesting nozzle was used and positioned based on where most cotton bolls were expected. The idea is to create a simplified system in which cotton boll localization is treated as a priori information rather than relying on real-time cotton boll detection. Harvesting performance was evaluated in terms of how effectively the harvester suctions cotton bolls and the effective suction distance to the bolls. Preliminary field-test results showed an average success rate of 57.4% in harvesting locks about 12 mm from the harvester nozzle. In the two-row test, 40.7% of locks were harvested in Row A and 74.1% in Row B. Although both results were promising, further improvements are needed in the design of the harvesting module to make it suitable for farm applications.
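As a quick check on the reported numbers (ours, not the authors'), the overall 57.4% success rate is simply the mean of the two row-level rates; a minimal sketch:

```python
# Sanity check of the reported harvest rates (illustrative, not code from the paper):
# the overall success rate is the mean of the two row-level rates.
row_rates = {"Row A": 40.7, "Row B": 74.1}   # percent of locks harvested per row
average = sum(row_rates.values()) / len(row_rates)
print(f"Average success rate: {average:.1f}%")   # -> 57.4%
```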
(This article belongs to the Special Issue Feature Papers in Cotton Automation, Machine Vision and Robotics)

17 pages, 4131 KiB  
Article
Mapping and Estimating Weeds in Cotton Using Unmanned Aerial Systems-Borne Imagery
by Bishwa Sapkota, Vijay Singh, Dale Cope, John Valasek and Muthukumar Bagavathiannan
AgriEngineering 2020, 2(2), 350-366; https://doi.org/10.3390/agriengineering2020024 - 16 Jun 2020
Cited by 15 | Viewed by 5337
Abstract
In recent years, Unmanned Aerial Systems (UAS) have emerged as an innovative technology to provide spatio-temporal information about weed species in crop fields. Such information is a critical input for any site-specific weed management program. A multi-rotor UAS (Phantom 4) equipped with an RGB sensor was used to collect imagery in three bands (Red, Green, and Blue; 0.8 cm/pixel resolution) with the objectives of (a) mapping weeds in cotton and (b) determining the relationship between image-based weed coverage and ground-based weed densities. For weed mapping, three different weed density levels (high, medium, and low) were established for a mix of different weed species, with three replications. To determine weed densities through ground truthing, five quadrats (1 m × 1 m) were laid out in each plot. The aerial imagery was preprocessed and subjected to a Hough transformation to delineate cotton rows. Following the separation of inter-row vegetation from crop rows, a multi-level classification coupled with machine learning algorithms was used to distinguish intra-row weeds from cotton. Overall accuracy levels of 89.16%, 85.83%, and 83.33% and kappa values of 0.84, 0.79, and 0.75 were achieved for detecting weed occurrence in high-, medium-, and low-density plots, respectively. Further, ground-truth-based overall weed density values were fairly well correlated (r2 = 0.80) with image-based weed coverage assessments. Among the specific weed species evaluated, Palmer amaranth (Amaranthus palmeri S. Watson) showed the highest correlation (r2 = 0.91), followed by red sprangletop (Leptochloa mucronata Michx.) (r2 = 0.88). The results highlight the utility of UAS-borne RGB imagery for weed mapping and density estimation in cotton for precision weed management.
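A minimal sketch of the row-delineation step described above, assuming a UAS RGB image and using OpenCV's probabilistic Hough transform; the excess-green threshold, function name, and parameter values are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch (not the authors' code): delineate crop rows in a UAS RGB
# image by thresholding vegetation and applying a probabilistic Hough transform.
import cv2
import numpy as np

def detect_row_lines(image_path: str):
    bgr = cv2.imread(image_path)
    b, g, r = cv2.split(bgr.astype(np.float32))
    exg = 2 * g - r - b                       # excess-green index highlights vegetation
    mask = (exg > exg.mean()).astype(np.uint8) * 255
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    edges = cv2.Canny(mask, 50, 150)
    # Long, roughly parallel lines are taken as candidate cotton rows.
    lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=80,
                            minLineLength=200, maxLineGap=30)
    return [] if lines is None else [tuple(l[0]) for l in lines]

# Example usage (hypothetical file name):
# for x1, y1, x2, y2 in detect_row_lines("plot_orthomosaic.png"):
#     print(x1, y1, x2, y2)
```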
(This article belongs to the Special Issue Feature Papers in Cotton Automation, Machine Vision and Robotics)

Other


25 pages, 8263 KiB  
Technical Note
Cotton Gin Stand Machine-Vision Inspection and Removal System for Plastic Contamination: Software Design
by Mathew G. Pelletier, Greg A. Holt and John D. Wanjura
AgriEngineering 2021, 3(3), 494-518; https://doi.org/10.3390/agriengineering3030033 - 08 Jul 2021
Cited by 9 | Viewed by 3765
Abstract
The removal of plastic contamination from cotton lint is an issue of top priority to the U.S. cotton industry. One of the main sources of plastic contamination showing up in marketable cotton bales is plastic used to wrap cotton modules produced by John Deere round module harvesters. Despite diligent efforts by cotton ginning personnel to remove all plastic encountered during module unwrapping, plastic still finds a way into the cotton gin’s processing system. To help mitigate plastic contamination at the gin, a machine-vision detection and removal system was developed that utilizes low-cost color cameras to detect plastic coming down the gin-stand feeder apron and, upon detection, blows the plastic out of the cotton stream to prevent contamination. This paper presents the software design of this inspection and removal system. The system was tested throughout the entire 2019 cotton ginning season at two commercial cotton gins, and at one gin in the 2018 ginning season. The focus of this report is to describe the software design and discuss relevant issues that influenced the design of the software.
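The detect-then-eject loop described above might be structured roughly as follows; `grab_frame` and `fire_air_knife` are hypothetical stand-ins for the camera and ejector interfaces, and the HSV color gate is a simplified placeholder for the system's actual detection logic.

```python
# Rough sketch of a detect-and-eject loop (illustrative only; camera and
# actuator calls are hypothetical stand-ins, not the system's actual API).
import time
import cv2
import numpy as np

PLASTIC_HSV_LOW = np.array([90, 60, 60])      # assumed hue range for module-wrap plastic
PLASTIC_HSV_HIGH = np.array([130, 255, 255])
MIN_BLOB_AREA = 500                           # pixels; reject sensor noise

def frame_has_plastic(frame_bgr: np.ndarray) -> bool:
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, PLASTIC_HSV_LOW, PLASTIC_HSV_HIGH)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return any(cv2.contourArea(c) >= MIN_BLOB_AREA for c in contours)

def inspection_loop(grab_frame, fire_air_knife, period_s: float = 0.05):
    """Poll the feeder-apron camera and trigger the ejector on detection."""
    while True:
        frame = grab_frame()
        if frame is not None and frame_has_plastic(frame):
            fire_air_knife()                  # blow the contaminant out of the cotton stream
        time.sleep(period_s)
```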
(This article belongs to the Special Issue Feature Papers in Cotton Automation, Machine Vision and Robotics)

24 pages, 2145 KiB  
Commentary
Opportunities for Robotic Systems and Automation in Cotton Production
by Edward Barnes, Gaylon Morgan, Kater Hake, Jon Devine, Ryan Kurtz, Gregory Ibendahl, Ajay Sharda, Glen Rains, John Snider, Joe Mari Maja, J. Alex Thomasson, Yuzhen Lu, Hussein Gharakhani, James Griffin, Emi Kimura, Robert Hardin, Tyson Raper, Sierra Young, Kadeghe Fue, Mathew Pelletier, John Wanjura and Greg Holt
AgriEngineering 2021, 3(2), 339-362; https://doi.org/10.3390/agriengineering3020023 - 28 May 2021
Cited by 21 | Viewed by 7046
Abstract
Automation continues to play a greater role in agricultural production, with commercial systems now available for machine vision identification of weeds and other pests, autonomous weed control, and robotic harvesters for fruits and vegetables. The growing availability of autonomous machines in agriculture indicates that there are opportunities to increase automation in cotton production. This article considers how current and future advances in automation have impacted, could impact, or will impact cotton production practices. The results are organized to follow the cotton production process from land preparation to planting to within-season management through harvesting and ginning. For each step, current and potential opportunities to automate processes are discussed. Specific examples include advances in automated weed control and progress made in the use of robotic systems for cotton harvesting.
(This article belongs to the Special Issue Feature Papers in Cotton Automation, Machine Vision and Robotics)

5 pages, 2333 KiB  
Technical Note
A Plastic Contamination Image Dataset for Deep Learning Model Development and Training
by Mathew G. Pelletier, Greg A. Holt and John D. Wanjura
AgriEngineering 2020, 2(2), 317-321; https://doi.org/10.3390/agriengineering2020021 - 22 May 2020
Cited by 3 | Viewed by 3859
Abstract
The removal of plastic contamination in cotton lint is an issue of top priority for the U.S. cotton industry. One of the main sources of plastic contamination appearing in marketable cotton bales is plastic used to wrap cotton modules on cotton harvesters. To help mitigate plastic contamination at the gin, automatic inspection systems are needed to detect plastic and control removal systems. Due to significant cost constraints in the U.S. cotton ginning industry, the use of low-cost color cameras for detection of plastic contamination has been successfully adopted. However, some plastics of similar color to the background are difficult to detect when utilizing traditional machine learning algorithms. Hence, current detection/removal system designs are not able to remove all plastics, and there is still a need for better detection methods. Recent advances in deep learning convolutional neural networks (CNNs) show promise for enabling the use of low-cost color cameras for detection of objects of interest when placed against a background of similar color. They do this by mimicking the human visual detection system, focusing on differences in texture rather than color as the primary detection paradigm. The key to leveraging CNNs is the development of extensive image datasets required for training. One of the impediments to this methodology is the need for large image datasets in which each image must be annotated with bounding boxes that surround each object of interest. As this requirement is labor-intensive, there is significant value in these image datasets. This report details the included image dataset as well as the system design used to collect the images. For acquisition of the image dataset, a prototype detection system was developed and deployed in a commercial cotton gin, where images were collected for the duration of the 2018–2019 ginning season. A discussion of the observational impact that the system, utilizing traditional color-based machine learning algorithms, had on the reduction of plastic contamination at the commercial gin is also included.
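For context, bounding-box annotations of this kind are often stored in a COCO-style JSON layout; the sketch below assumes that hypothetical layout and is not a description of the released dataset's actual format.

```python
# Illustrative loader for bounding-box annotations (hypothetical COCO-style JSON;
# the released dataset's actual layout may differ).
import json
from dataclasses import dataclass

@dataclass
class BoxAnnotation:
    image_file: str
    x: float          # top-left corner, pixels
    y: float
    width: float
    height: float
    label: str        # e.g., "plastic"

def load_annotations(path: str) -> list[BoxAnnotation]:
    with open(path) as f:
        data = json.load(f)
    images = {img["id"]: img["file_name"] for img in data["images"]}
    categories = {c["id"]: c["name"] for c in data["categories"]}
    return [
        BoxAnnotation(images[a["image_id"]], *a["bbox"], categories[a["category_id"]])
        for a in data["annotations"]
    ]
```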
(This article belongs to the Special Issue Feature Papers in Cotton Automation, Machine Vision and Robotics)

14 pages, 6969 KiB  
Technical Note
A Cotton Module Feeder Plastic Contamination Inspection System
by Mathew G. Pelletier, Greg A. Holt and John D. Wanjura
AgriEngineering 2020, 2(2), 280-293; https://doi.org/10.3390/agriengineering2020018 - 20 May 2020
Cited by 12 | Viewed by 3785
Abstract
The removal of plastic contamination in cotton lint is an issue of top priority to the U.S. cotton industry. One of the main sources of plastic contamination showing up in marketable cotton bales, at the U.S. Department of Agriculture’s classing office, is plastic from the module wrap used to wrap cotton modules produced by the new John Deere round module harvesters. Despite diligent efforts by cotton ginning personnel to remove all plastic encountered during unwrapping of the seed cotton modules, plastic still finds a way into the cotton gin’s processing system. To help mitigate plastic contamination at the gin, an inspection system was developed that utilizes low-cost color cameras to see plastic on the module feeder’s dispersing cylinders, which are normally hidden from view by the incoming feed of cotton modules. This technical note presents the design of an automated, intelligent, machine-vision-guided cotton module-feeder inspection system. The system includes a machine-learning program that automatically detects plastic contamination and alerts cotton gin personnel to the presence of plastic on the module feeder’s dispersing cylinders. The system was tested throughout the entire 2019 cotton ginning season at two commercial cotton gins, and at one gin in the 2018 ginning season. This note describes the overall system and mechanical design and provides an overview of key relevant issues. Included as an attachment to this technical note are all the mechanical engineering design files as well as the bill-of-materials part source list. A discussion of the observational impact the system had on the reduction of plastic contamination is also addressed.
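The monitor-and-alert pattern described above could be sketched as follows, with `grab_frame` and `classify_plastic` as hypothetical placeholders for the camera interface and the machine-learning detector.

```python
# Sketch of the monitor-and-alert pattern (illustrative only; the classifier
# and alert channel are hypothetical placeholders, not the system's actual API).
import time
import logging

logging.basicConfig(level=logging.INFO)

def monitor_dispersing_cylinders(grab_frame, classify_plastic, alert_interval_s=60.0):
    """Check camera frames of the dispersing cylinders and alert gin personnel on detection."""
    last_alert = 0.0
    while True:
        frame = grab_frame()                          # frame captured when cylinders are visible
        if frame is not None and classify_plastic(frame):
            now = time.monotonic()
            if now - last_alert >= alert_interval_s:  # rate-limit repeated alerts
                logging.warning("Plastic detected on module feeder dispersing cylinders")
                last_alert = now
        time.sleep(1.0)
```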
(This article belongs to the Special Issue Feature Papers in Cotton Automation, Machine Vision and Robotics)
