
Agricultural Sensing and Image Analysis

A special issue of Sensors (ISSN 1424-8220). This special issue belongs to the section "Remote Sensors".

Deadline for manuscript submissions: closed (15 February 2019) | Viewed by 49531

Special Issue Editors

Guest Editor
Prof. Dr. Yufeng Ge
Department of Biological Systems Engineering, University of Nebraska-Lincoln, Lincoln, NE 68583, USA
Interests: sensor-based plant phenotyping; optoelectronic sensor development in agriculture; VIS/NIR/MIR spectroscopy; agricultural remote sensing and image analysis; precision agriculture and spatial statistics

Guest Editor
Prof. Dr. Changying (Charlie) Li
College of Engineering, University of Georgia, Athens, GA, USA
Interests: agricultural automation and robotics; sensing; machine learning; deep learning; computer vision; high throughput phenotyping; digital agriculture; precision agriculture

Special Issue Information

Dear Colleagues,

Agriculture has entered an information-intensive era. Imaging sensors are used ubiquitously to capture information on plants, animals, and natural resources (soil and water) across multiple scales, at unprecedented spatial and temporal resolutions. These sensors are deployed on satellites and manned aerial platforms, and more recently on drones and field robotic systems. Imaging modalities have also expanded quickly from RGB and multispectral to thermal infrared, hyperspectral, and 3D imaging (stereovision and LiDAR). The broad use of imaging sensors brings great opportunities as well as tremendous challenges. One such challenge is to process and analyze images and generate actionable recommendations in a timely and cost-effective manner for researchers and agricultural practitioners. This Special Issue invites papers that report innovative and advanced remote (and close-range) sensing applications in agriculture. Papers that address contemporary topics such as high-throughput plant phenotyping, imaging sensor networks, and deep-learning-based image analysis are particularly encouraged.

Prof. Dr. Yufeng Ge
Prof. Dr. Changying (Charlie) Li
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com after registering and logging in to the website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Sensors is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Remote and close-range sensing
  • Plant health and stress
  • Animal health and welfare
  • Digital image processing
  • Convolutional neural network
  • Multimodal imaging and fusion

Published Papers (10 papers)


Research

13 pages, 3948 KiB  
Article
A Phenotype-Based Approach for the Substrate Water Status Forecast of Greenhouse Netted Muskmelon
by Liying Chang, Yilu Yin, Jialin Xiang, Qian Liu, Daren Li and Danfeng Huang
Sensors 2019, 19(12), 2673; https://doi.org/10.3390/s19122673 - 13 Jun 2019
Cited by 4 | Viewed by 2789
Abstract
Cultivation substrate water status is of great importance to the production of netted muskmelon (Cucumis melo L. var. reticulatus Naud.), and a prediction model for the substrate water status would be beneficial in guiding irrigation scheduling. In this study, a random forest machine learning model was used to forecast the substrate water status from plant phenotypic traits throughout the muskmelon growing season. Two varieties of netted muskmelon, "Wanglu" and "Arus", were grown in a greenhouse under four substrate water treatments, and their phenotypic traits were measured from images taken in the visible and near-infrared spectra, respectively. Results showed that a simplified model using only the five traits with the most significant contributions outperformed the original model in forecasting speed. The forecast accuracy reached 77.60%, 94.37%, and 90.01% for the seedling, vine elongation, and fruit growth stages, respectively. Combining imaging-derived phenotypic traits with machine learning provides a robust forecast of the water status around the plant root zone.
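
For readers who want to experiment with this kind of trait-based forecast, a minimal sketch of the workflow (a random forest trained on image-derived traits, then simplified to the five most important ones) is given below. The data, trait count, and class labels are placeholder assumptions, not the authors' dataset or code.

```python
# Minimal sketch of the trait-based water-status forecast described above.
# The data, trait count, and class labels are placeholders, not the
# authors' dataset; only the workflow (full RF -> top-5-trait RF) is shown.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.random((200, 12))                      # 12 hypothetical phenotypic traits
y = rng.integers(0, 4, 200)                    # 4 substrate water treatments

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

full = RandomForestClassifier(n_estimators=500, random_state=0).fit(X_tr, y_tr)

# Simplified model: keep only the five most important traits.
top5 = np.argsort(full.feature_importances_)[::-1][:5]
simple = RandomForestClassifier(n_estimators=500, random_state=0).fit(X_tr[:, top5], y_tr)

print("full model accuracy: ", full.score(X_te, y_te))
print("top-5-trait accuracy:", simple.score(X_te[:, top5], y_te))
```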

16 pages, 11324 KiB  
Article
An Improved Multi-temporal and Multi-feature Tea Plantation Identification Method Using Sentinel-2 Imagery
by Jun Zhu, Ziwu Pan, Hang Wang, Peijie Huang, Jiulin Sun, Fen Qin and Zhenzhen Liu
Sensors 2019, 19(9), 2087; https://doi.org/10.3390/s19092087 - 05 May 2019
Cited by 36 | Viewed by 4294
Abstract
As tea is an important economic crop in many regions, efficient and accurate methods for remotely identifying tea plantations are essential for implementing sustainable tea practices and for periodic monitoring. In this study, we developed and tested a method for tea plantation identification based on multi-temporal Sentinel-2 images and a multi-feature Random Forest (RF) algorithm. We used the phenological patterns of tea cultivation in China's Shihe District (such as the multiple annual growing, harvest, and pruning stages) to extract multi-temporal Sentinel-2 MSI bands, their first spectral derivatives, NDVI, textures, and topographic features. We then assessed feature importance using RF analysis; the optimal combination of features was used as the input to RF classification to extract tea plantations in the study area. A comparison of our results with those of the Support Vector Machine method and with statistical data from local government departments showed that our method had a higher producer's accuracy (96.57%) and user's accuracy (96.02%). These results demonstrate that: (1) multi-temporal and multi-feature classification can improve the accuracy of tea plantation recognition, (2) RF feature importance analysis can effectively reduce feature dimensionality and improve classification efficiency, and (3) the combination of multi-temporal Sentinel-2 images and the RF algorithm improves our ability to identify and monitor tea plantations.
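
The feature construction step (per-date NDVI plus first spectral derivatives of the band spectrum) can be sketched roughly as follows; the band list, array shapes, and random data are assumptions for illustration, not the paper's exact configuration.

```python
# Sketch: per-pixel features from a multi-temporal Sentinel-2 stack.
# `stack` is assumed to be (dates, bands, rows, cols) reflectance; the
# wavelengths are Sentinel-2 MSI band centers (nm), and the data are random.
import numpy as np

wavelengths = np.array([490, 560, 665, 705, 740, 783, 842, 865, 1610, 2190])
dates, bands, rows, cols = 5, len(wavelengths), 100, 100
stack = np.random.rand(dates, bands, rows, cols)

red, nir = stack[:, 2], stack[:, 6]            # 665 nm and 842 nm bands
ndvi = (nir - red) / (nir + red + 1e-9)        # one NDVI layer per date

# First spectral derivative: gradient of reflectance along the band axis.
deriv = np.gradient(stack, wavelengths, axis=1)

# Flatten into a per-pixel feature matrix for the random forest classifier.
features = np.concatenate([ndvi.reshape(dates, -1),
                           deriv.reshape(dates * bands, -1)]).T
print(features.shape)                          # (pixels, dates + dates*bands)
```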

17 pages, 5062 KiB  
Article
A Novel Illumination Compensation Technique for Multi-Spectral Imaging in NDVI Detection
by Rui Jiang, Pei Wang, Yan Xu, Zhiyan Zhou, Xiwen Luo and Yubin Lan
Sensors 2019, 19(8), 1859; https://doi.org/10.3390/s19081859 - 18 Apr 2019
Cited by 6 | Viewed by 3340
Abstract
To overcome the dependence of multi-spectral cameras on sunlight, an active-light-source multi-spectral imaging system was designed, and a preliminary experimental study was conducted at night without solar interference. The system includes an active light source and a multi-spectral camera. The active light source consists of four integrated LED (Light-Emitting Diode) arrays and adjustable constant-current power supplies; the red LED arrays and the near-infrared LED arrays are each driven by an independently adjustable constant-current power supply. The center wavelengths of the light source are 668 nm and 840 nm, consistent with those of the filters of the RedEdge-M multi-spectral camera. This paper shows that the measured radiation intensity is proportional to the drive current and inversely proportional to the square of the radiation distance, in accordance with the inverse square law of light. Taking the inverse square law into account, a radiation attenuation model was established based on imaging-system principles and spatial geometry. A verification test showed that the average error between the radiation intensity obtained from the model and the value measured with a spectrometer is less than 0.0003 W/m². In addition, the fitted curve relating the multi-spectral image grayscale digital number (DN) and the reflected radiation intensity at 668 nm (red light) is y = −3484230x² + 721083x + 5558, with a coefficient of determination of R² = 0.998; the fitted curve at 840 nm (near-infrared light) is y = 491469.88x + 3204, with R² = 0.995. The reflected radiation intensity of the plant canopy can therefore be calculated from the grayscale DN, from which the reflectance of red and near-infrared light, and in turn the Normalized Difference Vegetation Index (NDVI), can be derived. Based on the above model, four plants were placed 2.85 m from the active-light-source multi-spectral imaging system for testing, and the NDVI of each plant was also measured with a GreenSeeker hand-held crop sensor. The results show that the data from the two systems were linearly related, with a correlation coefficient of 0.995, indicating that the system described here can effectively measure vegetation NDVI. To apply this technology to UAV remote sensing, the attenuation of the radiation intensity and the working distance of the light source need to be considered carefully.
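
A hedged sketch of the DN-to-NDVI chain described above follows. It assumes the reported fits map reflected radiation intensity x (W/m²) to DN y, and that incident intensity follows the inverse square law; the DN values and source intensities are placeholders.

```python
# Sketch: DN -> reflected radiation -> reflectance -> NDVI.
# Assumes the reported fits map reflected intensity x (W/m^2) to DN y, and
# that incident intensity follows the inverse square law; DN values and
# source intensities below are placeholders.

def red_intensity(dn):
    # Invert the reported red fit y = -3484230x^2 + 721083x + 5558 (rising branch).
    a, b, c = -3484230.0, 721083.0, 5558.0 - dn
    return (-b + (b * b - 4 * a * c) ** 0.5) / (2 * a)

def nir_intensity(dn):
    # Invert the reported linear NIR fit y = 491469.88x + 3204.
    return (dn - 3204.0) / 491469.88

def incident(i0, distance_m):
    return i0 / distance_m ** 2                # inverse square law of light

d = 2.85                                       # test distance from the study (m)
red_refl = red_intensity(7000) / incident(1.0, d)
nir_refl = nir_intensity(12000) / incident(1.2, d)
ndvi = (nir_refl - red_refl) / (nir_refl + red_refl)
print(round(ndvi, 3))                          # ~0.76 with these placeholders
```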

14 pages, 6323 KiB  
Article
Detection and Classification of Root and Butt-Rot (RBR) in Stumps of Norway Spruce Using RGB Images and Machine Learning
by Ahmad Ostovar, Bruce Talbot, Stefano Puliti, Rasmus Astrup and Ola Ringdahl
Sensors 2019, 19(7), 1579; https://doi.org/10.3390/s19071579 - 01 Apr 2019
Cited by 10 | Viewed by 4348
Abstract
Root and butt-rot (RBR) has a significant impact on both the material and economic outcomes of timber harvesting, and therewith on the individual forest owner and collectively on the forest and wood-processing industries. Accurate recording of the presence of RBR during timber harvesting would enable mapping of the location and extent of the problem, providing a basis for evaluating its spread in a climate anticipated to enhance pathogenic growth in the future. A system that automatically identifies and detects the presence of RBR would therefore constitute an important contribution to addressing the problem without increasing workload complexity for the machine operator. In this study, we developed and evaluated an approach based on RGB images to automatically detect tree stumps and classify them according to the absence or presence of rot. Furthermore, since knowledge of the extent of RBR is valuable in categorizing logs, we also classified stumps into three classes of infestation: rot = 0%, 0% < rot < 50%, and rot ≥ 50%. We used deep-learning approaches and conventional machine-learning algorithms for the detection and classification tasks. The results showed that tree stumps were detected with a precision of 95% and a recall of 80%. Using only the correct outputs (true positives) of the stump detector, stumps without and with RBR were correctly classified with accuracies of 83.5% and 77.5%, respectively. Classifying rot into three classes resulted in 79.4%, 72.4%, and 74.1% accuracy for stumps with rot = 0%, 0% < rot < 50%, and rot ≥ 50%, respectively. With some modifications, the developed algorithm could be used either during the harvesting operation to detect RBR regions on tree stumps or as an RBR detector for post-harvest assessment of tree stumps and logs.
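
As a quick reference, the detection metrics quoted above follow directly from true-positive, false-positive, and false-negative counts; the counts in this sketch are chosen only to reproduce the reported 95%/80% figures and are not from the paper.

```python
# Detection metrics as used above; the counts are chosen only to reproduce
# the reported 95% precision / 80% recall and are not from the paper.
tp, fp, fn = 80, 4, 20                 # true positives, false positives, false negatives
precision = tp / (tp + fp)             # share of detections that are real stumps
recall = tp / (tp + fn)                # share of real stumps that were detected
print(f"precision={precision:.2f}, recall={recall:.2f}")   # 0.95, 0.80
```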

14 pages, 3500 KiB  
Article
3-D Image-Driven Morphological Crop Analysis: A Novel Method for Detection of Sunflower Broomrape Initial Subsoil Parasitism
by Ran Nisim Lati, Sagi Filin, Bashar Elnashef and Hanan Eizenberg
Sensors 2019, 19(7), 1569; https://doi.org/10.3390/s19071569 - 01 Apr 2019
Cited by 9 | Viewed by 3223
Abstract
Effective control of the parasitic weed sunflower broomrape (Orobanche cumana Wallr.) can be achieved by herbicide application in the early parasitism stages. However, growing environmental concerns associated with herbicide treatments have motivated the adoption of precise chemical control approaches that detect and treat infested areas exclusively. The main challenge in developing such control practices for O. cumana lies in the fact that most of its life cycle occurs below the soil surface; by the time shoots emerge and become observable, the damage to the crop is irreversible. This paper approaches early O. cumana detection by hypothesizing that its parasitism already impacts host plant morphology at the subsoil developmental stage. To test this hypothesis, O. cumana-infested sunflower and non-infested control plants were grown in pots and imaged weekly over a 45-day period. Three-dimensional plant models were reconstructed using image-based multi-view stereo, and their morphological parameters were derived down to the organ level. Among the parameters estimated, height and first internode length were the earliest definitive indicators of infection, and the detection timing of both parameters was early enough for post-emergence herbicide application. Because 3-D morphological modeling is nondestructive, is based on commercially available RGB sensors, and can be used under natural illumination, this approach holds potential for site-specific pre-emergence management of parasitic weeds and as a phenotyping tool in O. cumana-resistant sunflower breeding programs.
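
A minimal sketch of deriving the two early indicators (plant height and first internode length) from a reconstructed 3-D point cloud is shown below; the cloud format and node coordinates are assumptions, since the paper's organ-level pipeline is considerably more involved.

```python
# Sketch: two early-indicator parameters from a reconstructed 3-D plant model.
# `points` is a placeholder (N, 3) cloud in metres with z up; the node
# coordinates stand in for an upstream organ-level segmentation.
import numpy as np

points = np.random.rand(1000, 3) * [0.2, 0.2, 0.5]
ground_z = np.percentile(points[:, 2], 1)          # robust ground estimate
height = points[:, 2].max() - ground_z             # plant height

node1 = np.array([0.10, 0.10, 0.05])               # hypothetical stem nodes
node2 = np.array([0.11, 0.10, 0.12])
first_internode = np.linalg.norm(node2 - node1)    # first internode length
print(round(height, 3), round(first_internode, 3))
```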

15 pages, 4131 KiB  
Article
Controlled Lighting and Illumination-Independent Target Detection for Real-Time Cost-Efficient Applications. The Case Study of Sweet Pepper Robotic Harvesting
by Boaz Arad, Polina Kurtser, Ehud Barnea, Ben Harel, Yael Edan and Ohad Ben-Shahar
Sensors 2019, 19(6), 1390; https://doi.org/10.3390/s19061390 - 21 Mar 2019
Cited by 45 | Viewed by 4816
Abstract
Current harvesting robots are limited by low detection rates due to the unstructured and dynamic nature of both the objects and the environment. State-of-the-art algorithms include color- and texture-based detection, which are highly sensitive to illumination conditions. Deep learning algorithms promise robustness at the cost of significant computational resources and the need for extensive training databases. In this paper we present a Flash-No-Flash (FNF) controlled-illumination acquisition protocol that frees the system from most ambient illumination effects and facilitates robust target detection while using only modest computational resources and no supervised training. The approach relies on acquiring two images of the scene in quick succession, with and without strong artificial lighting ("Flash"/"no-Flash"). The difference between these images represents the appearance of the target scene as if only the artificial light were present, allowing tight control over the illumination for color-based detection. A performance evaluation database was acquired in greenhouse conditions using an eye-in-hand RGB camera mounted on a robotic manipulator. The database includes 156 scenes with 468 images containing a total of 344 yellow sweet peppers. The performance of both color-blob and deep-learning detection algorithms is compared on Flash-only and FNF images. The collected database is made public.
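
The core FNF differencing step is simple enough to sketch; the snippet below is a hedged illustration using OpenCV, where the file names and the HSV range for "yellow" are illustrative assumptions rather than the paper's calibrated values.

```python
# Sketch of the Flash-No-Flash differencing described above, using OpenCV.
# File names and the HSV range for "yellow" are illustrative assumptions.
import cv2

flash = cv2.imread("scene_flash.png")        # with strong artificial light
no_flash = cv2.imread("scene_no_flash.png")  # ambient light only

# The difference shows the scene as if lit only by the artificial source.
fnf = cv2.subtract(flash, no_flash)

# Simple color-blob detection of yellow sweet peppers on the FNF image.
hsv = cv2.cvtColor(fnf, cv2.COLOR_BGR2HSV)
mask = cv2.inRange(hsv, (20, 80, 80), (35, 255, 255))
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
peppers = [c for c in contours if cv2.contourArea(c) > 500]
print(len(peppers), "candidate peppers")
```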

19 pages, 5705 KiB  
Article
Identification of Wheat Yellow Rust Using Optimal Three-Band Spectral Indices in Different Growth Stages
by Qiong Zheng, Wenjiang Huang, Ximin Cui, Yingying Dong, Yue Shi, Huiqin Ma and Linyi Liu
Sensors 2019, 19(1), 35; https://doi.org/10.3390/s19010035 - 21 Dec 2018
Cited by 69 | Viewed by 5200
Abstract
Yellow rust, a widely known destructive wheat disease, affects wheat quality and causes large economic losses in wheat production. Hyperspectral remote sensing has shown potential for the detection of plant disease. This study aimed to analyze the spectral reflectance of the wheat canopy in the range of 350–1000 nm and to develop optimal spectral indices to detect yellow rust disease in wheat at different growth stages. The wavebands sensitive to yellow rust infection were located in the range 460–720 nm in the early-mid growth stage (from booting to anthesis), and in the ranges 568–709 nm and 725–1000 nm in the mid-late growth stage (from filling to milky ripeness). All possible three-band combinations over these sensitive wavebands were computed in the forms of the PRI (Photochemical Reflectance Index) and ARI (Anthocyanin Reflectance Index) at the different growth stages and assessed for their ability to estimate the severity of yellow rust disease. The optimal spectral index for estimating yellow rust infection was PRI (570, 525, 705) during the early-mid growth stage, with an R² of 0.669, and ARI (860, 790, 750) during the mid-late growth stage, with an R² of 0.888. Compared with previously reported vegetation indices, the proposed spectral indices satisfactorily discriminated wheat yellow rust: the classification accuracy for PRI (570, 525, 705) was 80.6% with a kappa coefficient of 0.61 in the early-mid growth stage, and the classification accuracy for ARI (860, 790, 750) was 91.9% with a kappa coefficient of 0.75 in the mid-late growth stage. On the validation dataset, the classification accuracies of the two indices reached 84.1% and 93.2% in the early-mid and mid-late growth stages, respectively. We conclude that the three-band spectral indices PRI (570, 525, 705) and ARI (860, 790, 750) are optimal for monitoring yellow rust infection in these two growth stages. Our method is expected to provide a technical basis for wheat disease detection and prevention in the early-mid growth stage, and for the estimation of yield losses in the mid-late growth stage.
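
The exhaustive three-band search described above can be sketched as follows. The index form and the random data are simplified stand-ins; the paper's exact PRI/ARI-form equations are not reproduced here.

```python
# Sketch of the exhaustive three-band index search described above. The
# index form and random data are simplified stand-ins; the paper's exact
# PRI/ARI-form equations are not reproduced here.
from itertools import combinations
import numpy as np

bands = np.arange(460, 721, 5)                 # sensitive range, early-mid stage
refl = np.random.rand(60, len(bands))          # canopy spectra (plots x bands)
severity = np.random.rand(60)                  # disease severity per plot

def three_band_index(r, i, j, k):
    # Hypothetical normalized-difference form modulated by a third band.
    return (r[:, i] - r[:, j]) / (r[:, i] + r[:, j]) * r[:, k]

best = max(combinations(range(len(bands)), 3),
           key=lambda c: np.corrcoef(three_band_index(refl, *c), severity)[0, 1] ** 2)
print("optimal band combination (nm):", bands[list(best)])
```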

19 pages, 3730 KiB  
Article
Data Fusion of Two Hyperspectral Imaging Systems with Complementary Spectral Sensing Ranges for Blueberry Bruising Detection
by Shuxiang Fan, Changying Li, Wenqian Huang and Liping Chen
Sensors 2018, 18(12), 4463; https://doi.org/10.3390/s18124463 - 17 Dec 2018
Cited by 34 | Viewed by 4565
Abstract
Currently, the detection of blueberry internal bruising focuses mostly on single hyperspectral imaging (HSI) systems; attempts to fuse HSI systems with complementary spectral ranges are still lacking. A push-broom-based HSI system and a liquid crystal tunable filter (LCTF)-based HSI system, with different sensing ranges and detectors, were investigated to jointly detect blueberry internal bruising in the lab. The mean reflectance spectrum of each berry sample was extracted from the data obtained by the two HSI systems. The spectral data from the two spectroscopic techniques were analyzed separately using a feature selection method, partial least squares discriminant analysis (PLS-DA), and support vector machines (SVM), and then fused with three data fusion strategies at the data, feature, and decision levels. All three data fusion strategies achieved better classification results than either HSI system alone. Decision-level fusion, which integrates the classification results from the two instruments using selected relevant features, achieved the most promising results, suggesting that the two HSI systems with complementary spectral ranges, combined with feature selection and data fusion strategies, can be used synergistically to improve blueberry internal bruising detection. This study is a first step in demonstrating the feasibility of fusing two HSI systems with complementary spectral ranges for detecting blueberry bruising, which could lead to a multispectral imaging system with a few selected wavelengths and an appropriate detector for bruising detection on the packing line.
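
The decision-level strategy can be illustrated with a short sketch: one classifier per HSI system, with their posterior estimates averaged. The data shapes and random spectra are placeholders; feature-level fusion would instead concatenate the two feature sets before training a single classifier.

```python
# Sketch of decision-level fusion: one classifier per HSI system, posteriors
# averaged. Shapes are placeholders; feature-level fusion would instead
# concatenate X1 and X2 before training a single classifier.
import numpy as np
from sklearn.svm import SVC

X1 = np.random.rand(120, 50)                   # mean spectra, push-broom system
X2 = np.random.rand(120, 30)                   # mean spectra, LCTF system
y = np.random.randint(0, 2, 120)               # bruised vs. sound

clf1 = SVC(probability=True).fit(X1, y)
clf2 = SVC(probability=True).fit(X2, y)

proba = (clf1.predict_proba(X1) + clf2.predict_proba(X2)) / 2
fused = proba.argmax(axis=1)
print("fused accuracy (on training data):", (fused == y).mean())
```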

12 pages, 2901 KiB  
Article
Insect Detection and Classification Based on an Improved Convolutional Neural Network
by Denan Xia, Peng Chen, Bing Wang, Jun Zhang and Chengjun Xie
Sensors 2018, 18(12), 4169; https://doi.org/10.3390/s18124169 - 27 Nov 2018
Cited by 136 | Viewed by 12815
Abstract
Insect pests are one of the main factors affecting crop yield. Because most insect species are extremely similar in appearance, insect detection in field crops such as rice and soybean is more challenging than generic object detection. At present, distinguishing insects in crop fields relies mainly on manual classification, which is an extremely time-consuming and expensive process. This work proposes a convolutional neural network model for the multi-classification of crop insects. The model exploits the advantages of deep neural networks to comprehensively extract multifaceted insect features. During the region proposal stage, a Region Proposal Network is adopted rather than traditional selective search, generating a smaller number of proposal windows, which is especially important for improving prediction accuracy and accelerating computation. Experimental results show that the proposed method achieves higher accuracy than state-of-the-art traditional insect classification algorithms.
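
To see the region-proposal idea in practice, the sketch below runs an off-the-shelf RPN-based detector (Faster R-CNN from torchvision) of the same family. It is not the authors' network, and the COCO-pretrained classes are not insects; it only illustrates replacing selective search with a Region Proposal Network.

```python
# Sketch: an off-the-shelf RPN-based detector (Faster R-CNN from torchvision)
# of the same family as the model above. It is not the authors' network, and
# the COCO-pretrained classes are not insects; it only illustrates replacing
# selective search with a Region Proposal Network.
import torch
import torchvision

model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

image = torch.rand(3, 480, 640)                # placeholder field-crop image
with torch.no_grad():
    pred = model([image])[0]                   # dict with boxes, labels, scores

keep = pred["scores"] > 0.5
print(int(keep.sum()), "detections above 0.5 confidence")
```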

15 pages, 3174 KiB  
Article
Multi-Year Mapping of Major Crop Yields in an Irrigation District from High Spatial and Temporal Resolution Vegetation Index
by Bing Yu and Songhao Shang
Sensors 2018, 18(11), 3787; https://doi.org/10.3390/s18113787 - 06 Nov 2018
Cited by 19 | Viewed by 3214
Abstract
Crop yield estimation is important for formulating informed regional and national food trade policies. The introduction of remote sensing in agricultural monitoring makes accurate estimation of regional crop yields possible. However, remote sensing images and crop distribution maps with coarse spatial resolution usually cause inaccurate yield estimates because of mixed pixels. This study aimed to estimate the annual yields of maize and sunflower in the Hetao Irrigation District in North China using 30 m spatial resolution HJ-1A/1B CCD images and high-accuracy multi-year crop distribution maps. The Normalized Difference Vegetation Index (NDVI) time series obtained from HJ-1A/1B CCD images was fitted with an asymmetric logistic curve to calculate daily NDVI and phenological characteristics. Eight random forest (RF) models using different predictors were developed for maize and sunflower yield estimation, where the predictors of each model were a combination of NDVI series and/or phenological characteristics. We calibrated all RF models with crop yields measured at sampling points in two years (2014 and 2015), and validated them against the statistical yields of four counties over six years. Results showed that the optimal model for maize yield estimation used the NDVI series from the 120th to the 210th day of year at 10-day intervals as predictors, while that for sunflower used the combination of three NDVI characteristics, three phenological characteristics, and two curve parameters. The selected RF models estimated multi-year regional crop yields accurately, with average root-mean-square errors and relative errors of 0.75 t/ha and 6.1% for maize, and 0.40 t/ha and 10.1% for sunflower, respectively. Moreover, the yields of maize and sunflower could be estimated fairly well from NDVI series ending 50 days before harvest, which suggests the possibility of forecasting crop yields before harvest.
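
The curve-fitting step can be sketched as follows, using a Richards-type asymmetric logistic function as one plausible choice; the paper's exact curve form is not reproduced, and the dates and NDVI values below are placeholders.

```python
# Sketch: fitting an NDVI time series with an asymmetric (Richards-type)
# logistic curve to derive daily NDVI and a phenological date. The paper's
# exact curve form is not reproduced; dates and NDVI values are placeholders.
import numpy as np
from scipy.optimize import curve_fit

def asym_logistic(t, base, amp, k, t0, m):
    return base + amp / (1.0 + np.exp(-k * (t - t0))) ** m

doy = np.array([130., 145., 160., 175., 190., 205., 220., 235.])
ndvi = np.array([0.18, 0.22, 0.40, 0.65, 0.78, 0.83, 0.85, 0.86])

params, _ = curve_fit(asym_logistic, doy, ndvi,
                      p0=[0.15, 0.7, 0.1, 170.0, 1.0], maxfev=10000)

days = np.arange(120, 240)
daily = asym_logistic(days, *params)           # interpolated daily NDVI
green_up = days[np.argmax(np.gradient(daily))] # fastest green-up day
print("estimated green-up day of year:", int(green_up))
```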
