
Phenotyping Technologies for Resistance Screening, Crop Breeding and Precision Agriculture

A special issue of Remote Sensing (ISSN 2072-4292). This special issue belongs to the section "Remote Sensing in Agriculture and Vegetation".

Deadline for manuscript submissions: closed (1 December 2023) | Viewed by 14,928

Special Issue Editors

Dr. Wen-Hao Su
College of Engineering, China Agricultural University, Beijing 100083, China
Interests: smart urban agriculture; artificial intelligence; agricultural robotics; automated control; unmanned aerial vehicle; plant phenotyping; computer vision; crop plant signaling; machine (deep) learning; food processing and safety; fluorescence imaging; hyper/multispectral imaging; Vis/NIR/MIR imaging spectroscopy
Dr. Yuxin Miao
Precision Agriculture Center, University of Minnesota, St. Paul, MN 55108, USA
Interests: precision agriculture; proximal and remote sensing; precision nitrogen management; integration of crop growth modeling; remote sensing and machine/deep learning; integrated precision crop management; food security and sustainable development

Special Issue Information

Dear Colleagues,

Climate change poses a great threat to sustainable food production worldwide, yet the rapidly growing demand for food requires crop yields to keep increasing every year. Optimized soil management, early diagnosis of crop diseases, and the breeding of resistant cultivars are key to increasing global food production. Phenotyping technologies, from proximal to remote sensing, allow rapid monitoring of orchards and crops at different scales, from single leaves to individual plants to entire fields, thereby informing the genetic analysis of plant traits such as growth, development, resistance, architecture, physiology, and nutrient status.

This Special Issue seeks studies covering different phenotyping technologies, such as RGB imaging, fluorescence imaging, multi/hyperspectral imaging, and related platforms, that provide important information on how environmental stress, genetics, and precision management guide the selection of productive plants.

Topics may cover any aspect of the automatic identification and assessment of plant traits, including stress tolerance (e.g., biotic and abiotic stresses), chemical traits (e.g., nutrients, secondary metabolites), and structural and functional traits such as leaf characteristics, plant height, photosynthetic efficiency, root morphology, fruit traits, biomass, and yield, from individual plant organs to full fields. Studies employing different sensing techniques and phenotyping platforms at multiple scales (e.g., ground vehicles, unmanned aerial vehicles, and satellites), as well as studies focused on resistance screening, crop breeding, precision agriculture, and other related issues, are welcome. Articles may address, but are not limited to, the following topics:

  • Remote sensing applications for precision agriculture;
  • Assessment of plant growth status;
  • Crop nutrient management;
  • Nitrogen management;
  • Plant phenotypes;
  • Plant instance detection;
  • Resistance of plants to biotic and abiotic stresses;
  • Smart sensing, monitoring, and control;
  • Smart breeding;
  • Smart farming.

Dr. Wen-Hao Su
Dr. Yuxin Miao
Dr. Zhou Zhang
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Remote Sensing is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2700 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • sustainable agriculture
  • proximal and remote sensing techniques
  • phenotyping sensors
  • non-destructive measurements
  • high-throughput screening
  • plant stress responses
  • nutrient, weed and disease diagnosis
  • computer vision
  • deep learning
  • image processing

Published Papers (5 papers)


Research

15 pages, 6148 KiB  
Article
Quantitative Evaluation of Maize Emergence Using UAV Imagery and Deep Learning
by Minguo Liu, Wen-Hao Su and Xi-Qing Wang
Remote Sens. 2023, 15(8), 1979; https://doi.org/10.3390/rs15081979 - 09 Apr 2023
Cited by 7 | Viewed by 2131
Abstract
Accurate assessment of crop emergence helps breeders select appropriate crop genotypes and helps farmers make timely field management decisions to increase maize yields. Crop emergence is conventionally assessed by manually counting the number and measuring the size of seedlings, which is laborious, inefficient, and unreliable, and fails to visualize the spatial distribution and uniformity of seedlings. Phenotyping technology based on remote sensing allows for high-throughput evaluation of crop emergence at the early growth stage. This study developed a system for the rapid estimation of maize seedling emergence based on a deep learning algorithm. RGB images acquired from an unmanned aerial vehicle (UAV) were used to develop the optimal model for the recognition of seedling location, spacing, and size, and the prediction performance of the system was evaluated at three stations during 2021–2022. A case study was conducted to demonstrate the evaluation of maize seedlings by the system in combination with TOPSIS (Technique for Order of Preference by Similarity to Ideal Solution) analysis. The results show that the system has good prediction performance for maize seedling count, with an average R2 value of 0.96 and an accuracy of 92%; however, shadows and planting density influence its accuracy. The prediction accuracy decreases significantly when the planting density is above 90,000 plants/ha. The distribution characteristics of seedling emergence and growth were also calculated based on the average value and variation coefficient of seedling spacing, seedling area, and seedling length. The estimation accuracies for the average value of seedling spacing, the coefficient of variation of seedling spacing, the average value of the seedling area, the coefficient of variation of the seedling area, and the average value of the seedling length were 87.52%, 87.55%, 82.69%, 84.51%, and 90.32%, respectively. In conclusion, the proposed system can quickly analyze maize seedling growth and uniformity characteristics of experimental plots and locate plots with poor maize emergence.
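As a rough illustration of how TOPSIS can turn per-plot seedling statistics into a ranking of emergence quality, the sketch below uses hypothetical criteria (seedling count, spacing variation, seedling area) and weights; it is not the authors' implementation, only a minimal NumPy version of the standard TOPSIS procedure.

    import numpy as np

    def topsis_rank(matrix, weights, benefit):
        """Rank alternatives (rows) with TOPSIS.
        matrix:  (n_plots, n_criteria) raw criterion values
        weights: (n_criteria,) importance weights summing to 1
        benefit: (n_criteria,) True if larger is better, False if smaller is better
        """
        X = np.asarray(matrix, dtype=float)
        # Vector-normalize each criterion column, then apply the weights
        V = X / np.linalg.norm(X, axis=0) * np.asarray(weights)
        # Ideal best/worst depend on whether the criterion is a benefit or a cost
        best = np.where(benefit, V.max(axis=0), V.min(axis=0))
        worst = np.where(benefit, V.min(axis=0), V.max(axis=0))
        d_best = np.linalg.norm(V - best, axis=1)
        d_worst = np.linalg.norm(V - worst, axis=1)
        score = d_worst / (d_best + d_worst)    # closeness to the ideal solution
        return score, np.argsort(-score)        # higher score = better emergence

    # Hypothetical per-plot statistics: [seedling count, spacing CV (%), mean seedling area (cm^2)]
    plots = [[62, 18.0, 24.5],
             [48, 35.0, 19.2],
             [58, 22.0, 23.1]]
    scores, ranking = topsis_rank(plots, weights=[0.4, 0.3, 0.3],
                                  benefit=[True, False, True])
    print(scores, ranking)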

18 pages, 26069 KiB  
Article
Identification and Counting of Sugarcane Seedlings in the Field Using Improved Faster R-CNN
by Yuyun Pan, Nengzhi Zhu, Lu Ding, Xiuhua Li, Hui-Hwang Goh, Chao Han and Muqing Zhang
Remote Sens. 2022, 14(22), 5846; https://doi.org/10.3390/rs14225846 - 18 Nov 2022
Cited by 9 | Viewed by 1899
Abstract
Sugarcane seedling emergence is important for sugar production. Manual counting is time-consuming and hardly practicable for large-scale field planting. Unmanned aerial vehicles (UAVs), with fast acquisition speed and wide coverage, are becoming increasingly popular in precision agriculture. We provide a method based on an improved Faster R-CNN for automatically detecting and counting sugarcane seedlings using aerial photography. The Sugarcane-Detector (SGN-D) uses ResNet-50 for feature extraction to produce high-resolution feature expressions and adds an attention module (SN-block) to focus the network on learning seedling feature channels. A feature pyramid network (FPN) aggregates multi-level features to tackle multi-scale problems, while the anchor boxes are optimized for sugarcane seedling size and quantity. To evaluate the efficacy and viability of the proposed technique, 238 images of sugarcane seedlings were taken from the air with a UAV. With an average accuracy of 93.67%, the proposed method outperforms other commonly used detection models, including the original Faster R-CNN, SSD, and YOLO. To eliminate the error caused by repeated counting, we further propose a seedling de-duplication algorithm. The highest counting accuracy reached 96.83%, and the mean absolute error (MAE) reached 4.6 when the intersection over union (IoU) threshold was 0.15. In addition, a software system was developed for the automatic identification and counting of cane seedlings. This work can provide accurate seedling data and thus support farmers in making proper cultivation management decisions.
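The de-duplication step can be pictured as keeping the highest-scoring detection among boxes that overlap beyond the chosen IoU threshold (0.15 in the paper). The sketch below is a minimal greedy filter under that assumption, not the authors' code; box coordinates, scores, and the threshold are illustrative.

    def iou(a, b):
        """Intersection over union of two boxes given as (x1, y1, x2, y2)."""
        ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
        ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
        inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
        area_a = (a[2] - a[0]) * (a[3] - a[1])
        area_b = (b[2] - b[0]) * (b[3] - b[1])
        return inter / (area_a + area_b - inter)

    def deduplicate(boxes, scores, iou_thresh=0.15):
        """Keep the highest-scoring detection among overlapping duplicates."""
        order = sorted(range(len(boxes)), key=lambda i: scores[i], reverse=True)
        kept = []
        for i in order:
            if all(iou(boxes[i], boxes[j]) < iou_thresh for j in kept):
                kept.append(i)
        return [boxes[i] for i in kept]   # de-duplicated seedling count = len(kept)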

21 pages, 5537 KiB  
Article
UAV-Based Multi-Temporal Thermal Imaging to Evaluate Wheat Drought Resistance in Different Deficit Irrigation Regimes
by Weilong Qin, Jian Wang, Longfei Ma, Falv Wang, Naiyue Hu, Xianyue Yang, Yiyang Xiao, Yinghua Zhang, Zhencai Sun, Zhimin Wang and Kang Yu
Remote Sens. 2022, 14(21), 5608; https://doi.org/10.3390/rs14215608 - 07 Nov 2022
Cited by 9 | Viewed by 2548
Abstract
Deficit irrigation is a common approach in water-scarce regions to balance productivity and water use, but drought stress still occurs to varying extents, leading to reduced physiological performance and a decrease in yield. Therefore, a rapid and reliable method to identify wheat varieties with drought resistance can help reduce yield loss under water deficit. In this study, we compared ten wheat varieties under three deficit irrigation regimes (W0, no irrigation during the growing season; W1, irrigation at jointing; W2, irrigation at jointing and anthesis). UAV thermal imagery, plant physiological traits [leaf area index (LAI), SPAD, photosynthesis (Pn), transpiration (Tr), stomatal conductance (Cn)], biomass, and yield were acquired at different growth stages. Wheat drought resistance was evaluated using the canopy temperature extracted from UAV thermal imagery (CT-UAV) in combination with hierarchical cluster analysis (HCA). The CT-UAV of the W0 and W1 treatments was significantly higher than in the W2 treatment, with ranges of 24.8–33.3 °C, 24.3–31.6 °C, and 24.1–28.9 °C in W0, W1, and W2, respectively. We found negative correlations between CT-UAV and LAI, SPAD, Pn, Tr, Cn, and biomass under the W0 (R2 = 0.41–0.79) and W1 (R2 = 0.22–0.72) treatments, but little correlation under the W2 treatment. Under the deficit irrigation treatments (W0 and W1), UAV thermal imagery was less effective before the grain-filling stage in evaluating drought resistance. This study demonstrates the potential of ensuring yield and saving irrigation water by identifying suitable wheat varieties for different water-scarce irrigation scenarios.
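Conceptually, the variety screening reduces to extracting a mean canopy temperature per plot from the thermal orthomosaic and clustering varieties on those values. The sketch below shows only the hierarchical-clustering step with SciPy; the CT-UAV values, variety names, and cluster count are hypothetical, and the authors' full pipeline is not reproduced.

    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster

    # Hypothetical mean CT-UAV values (deg C) per variety under one irrigation treatment
    varieties = ["V1", "V2", "V3", "V4", "V5"]
    ct_mean = np.array([[31.2], [28.4], [30.7], [27.9], [29.5]])

    # Ward linkage on canopy temperature; cooler canopies suggest better drought resistance
    Z = linkage(ct_mean, method="ward")
    groups = fcluster(Z, t=3, criterion="maxclust")
    for v, g in zip(varieties, groups):
        print(v, "-> cluster", g)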

16 pages, 2740 KiB  
Article
Estimation of Maize Yield and Flowering Time Using Multi-Temporal UAV-Based Hyperspectral Data
by Jiahao Fan, Jing Zhou, Biwen Wang, Natalia de Leon, Shawn M. Kaeppler, Dayane C. Lima and Zhou Zhang
Remote Sens. 2022, 14(13), 3052; https://doi.org/10.3390/rs14133052 - 25 Jun 2022
Cited by 14 | Viewed by 3800
Abstract
Maize (Zea mays L.) is one of the most consumed grains in the world. Within the context of continuous climate change and the reduced availability of arable land, it is urgent to breed new maize varieties and screen for desired traits, e.g., high yield and strong stress tolerance. Traditional phenotyping methods relying on manual assessment are time-consuming and prone to human error. Recently, the application of uncrewed aerial vehicles (UAVs) has gained increasing attention in plant phenotyping due to their efficiency in data collection. Moreover, hyperspectral sensors integrated with UAVs can offer data streams with high spectral and spatial resolutions, which are valuable for estimating plant traits. In this study, we collected UAV hyperspectral imagery over a maize breeding field biweekly across the growing season, resulting in 11 data collections in total. Multiple machine learning models were developed to estimate the grain yield and flowering time of the maize breeding lines using the hyperspectral imagery. The performance of the machine learning models and the efficacy of different hyperspectral features were evaluated. The results showed that models using the multi-temporal imagery outperformed those using imagery from single data collections, and ridge regression using the full-band reflectance achieved the best estimation accuracies, with correlation coefficients (r) between the estimates and ground truth of 0.54 for grain yield, 0.91 for days to silking, and 0.92 for days to anthesis. In addition, we assessed the estimation performance with data acquired at different growth stages to identify suitable periods for the UAV survey. The best estimation results were achieved using the data collected around the tasseling stage (VT) for grain yield estimation and around the reproductive stages (R1 or R4) for flowering time estimation. Our results show that the robust phenotyping framework proposed in this study has great potential to help breeders efficiently estimate key agronomic traits at early growth stages.
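As a sketch of the best-performing model reported here, ridge regression on full-band reflectance can be prototyped with scikit-learn as below; the data are synthetic, the band count is hypothetical, and the authors' multi-temporal feature stacking is not reproduced.

    import numpy as np
    from sklearn.linear_model import Ridge
    from sklearn.model_selection import train_test_split
    from scipy.stats import pearsonr

    rng = np.random.default_rng(0)
    n_plots, n_bands = 200, 270           # hypothetical plot count and hyperspectral band count
    X = rng.random((n_plots, n_bands))    # stand-in for per-plot mean reflectance spectra
    y = X @ rng.normal(size=n_bands) + rng.normal(scale=0.5, size=n_plots)  # stand-in trait (e.g., yield)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    model = Ridge(alpha=1.0).fit(X_tr, y_tr)
    r, _ = pearsonr(y_te, model.predict(X_te))   # correlation coefficient, as reported in the paper
    print(f"r = {r:.2f}")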

17 pages, 3668 KiB  
Article
Two-Stage Convolutional Neural Networks for Diagnosing the Severity of Alternaria Leaf Blotch Disease of the Apple Tree
by Bo-Yuan Liu, Ke-Jun Fan, Wen-Hao Su and Yankun Peng
Remote Sens. 2022, 14(11), 2519; https://doi.org/10.3390/rs14112519 - 24 May 2022
Cited by 23 | Viewed by 3017
Abstract
In many parts of the world, apple trees suffer severe foliar damage each year due to infection by Alternaria blotch (Alternaria alternata f. sp. mali), resulting in serious economic losses to growers. Traditional methods for disease detection and severity classification mostly rely on manual labor, which is slow, labor-intensive, and highly subjective. There is an urgent need to develop an effective protocol to rapidly and accurately evaluate disease severity. In this study, DeeplabV3+, PSPNet, and UNet were used to assess the severity of apple Alternaria leaf blotch. For the identification of leaves and disease areas, a dataset with a total of 5382 samples was randomly split into 74% (4004 samples) for model training, 9% (494 samples) for validation, 8% (444 samples) for testing, and 8% (440 samples) for overall testing. Apple leaves were first segmented from complex backgrounds using the deep learning algorithms with different backbones. Then, the recognition of disease areas was performed on the segmented leaves. The results showed that the PSPNet model with a MobileNetV2 backbone exhibited the highest performance in leaf segmentation, with precision, recall, and MIoU values of 99.15%, 99.26%, and 98.42%, respectively. The UNet model with a VGG backbone performed the best in disease-area prediction, with a precision of 95.84%, a recall of 95.54%, and a MIoU value of 92.05%. The ratio of disease area to leaf area was calculated to assess disease severity. The results showed that the average accuracy for severity classification was 96.41%. Moreover, both the correlation coefficient and the concordance correlation coefficient were 0.992, indicating high agreement between the reference values and the predicted values. This study proves the feasibility of rapid estimation of the severity of apple Alternaria leaf blotch, which will provide technical support for the precise application of pesticides.
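Once leaf and lesion masks are produced by the segmentation networks, the severity metric is simply the ratio of lesion pixels to leaf pixels, which can then be binned into severity grades. The sketch below is a minimal version of that final step; the grade thresholds are hypothetical and not taken from the paper.

    import numpy as np

    def severity_from_masks(leaf_mask, disease_mask, thresholds=(0.05, 0.10, 0.25, 0.50)):
        """Compute disease severity as the lesion-to-leaf area ratio and map it to a grade.
        leaf_mask, disease_mask: boolean arrays of the same shape (True = pixel belongs to class).
        thresholds: hypothetical grade boundaries on the area ratio.
        """
        leaf_px = int(leaf_mask.sum())
        lesion_px = int((disease_mask & leaf_mask).sum())  # count lesions only inside the leaf
        ratio = lesion_px / leaf_px if leaf_px else 0.0
        grade = int(np.searchsorted(thresholds, ratio, side="right"))  # 0 = healthy ... 4 = severe
        return ratio, grade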
