Special Issue "Acquire and Perceive: Novel Approaches for Imaging-Based Plant Phenotyping"

A special issue of Remote Sensing (ISSN 2072-4292). This special issue belongs to the section "Remote Sensing Image Processing".

Deadline for manuscript submissions: 31 March 2022.

Special Issue Editors

Dr. Mario Valerio Giuffrida
Guest Editor
School of Computing, Edinburgh Napier University, Edinburgh, UK
Interests: computer vision; deep learning; unsupervised domain adaptation; plant image analysis
Dr. Iftach Klapp
Guest Editor
Department of Sensing, Information and Mechanization Engineering, Institute of Agricultural Engineering, Agricultural Research Organization, the Volcani Center, P.O.B. 6, Bet Dagan 50250, Israel
Interests: environmental optical acquisition for agricultural tasks; computational optics; optical design; inverse problems; learning
Dr. Aharon Bar-Hillel
Guest Editor
Ben-Gurion University of the Negev, P.O.B. 653, Beer-Sheva 8410501, Israel
Interests: computer vision; machine learning; plant phenotyping; deep learning aspects: explainability; efficient inference; modular networks

Special Issue Information

Dear Colleagues,

Plants are the fundamental source of food for people, livestock, and all living species on Earth. The growth of the human population, expected to reach 10 billion by 2050, requires a 50% increase in agricultural production. Crop optimization is approached by multiple means, including the automation of agricultural operations and improved plant breeding processes, creating an urgent need for plant trait analysis and phenotyping. However, manual plant analysis is tedious, often destructive to the plant, and non-scalable. With improved sensors and recent advances in machine learning (especially deep learning), imaging-based plant analysis provides a promising alternative with a growing impact.

This Special Issue invites cutting-edge contributions on all aspects of imaging-based plant analysis. Image acquisition is one topic of particular interest. Agricultural monitoring is performed under complex and changing illumination conditions, often with modalities beyond RGB, such as depth or hyperspectral data. Papers considering illumination conditions, illumination design, joint illumination and acquisition algorithm design, sensor fusion, or image processing design are encouraged. Another topic of interest is image perception: computer vision and machine learning techniques applied to plant analysis from images. Novel phenotyping tasks, as well as methods for improved accuracy and/or robustness in existing tasks, are welcome. Additional topics of interest include (but are not limited to) fine-grained phenotyping, flexibility and task transfer, and phenotype tracking in time series. Furthermore, authors wishing to discuss a topic of particular interest and outline the next steps and challenges are welcome to submit review/survey papers.

Dr. Mario Valerio Giuffrida
Dr. Iftach Klapp
Dr. Aharon Bar-Hillel
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All papers will be peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Remote Sensing is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2400 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Plant phenotyping and acquisition
  • Computer vision
  • Machine learning/deep learning
  • Precision agriculture
  • Multi-modal imaging and sensor fusion
  • Joint illumination and image processing design
  • Acquisition and phenotype tracking in time series

Published Papers (4 papers)

Research

Article
Parts-per-Object Count in Agricultural Images: Solving Phenotyping Problems via a Single Deep Neural Network
Remote Sens. 2021, 13(13), 2496; https://0-doi-org.brum.beds.ac.uk/10.3390/rs13132496 - 26 Jun 2021
Cited by 2 | Viewed by 542
Abstract
Solving many phenotyping problems involves not only automatic detection of objects in an image, but also counting the number of parts per object. We propose a solution in the form of a single deep network, tested on three agricultural datasets pertaining to bananas-per-bunch, spikelets-per-wheat-spike, and berries-per-grape-cluster. The suggested network incorporates object detection, object resizing, and part counting as modules in a single deep network, with several variants tested. The detection module is based on a Retina-Net architecture, whereas for the counting module two different architectures are examined: the first based on direct regression of the predicted count, and the other on explicit part detection and counting. The results are promising, with the mean relative deviation between the estimated and visible part count in the range of 9.2% to 11.5%. Further inference of count-based, yield-related statistics is considered. For banana bunches, the actual banana count (including occluded bananas) is inferred from the count of visible bananas. For spikelets-per-wheat-spike, robust estimation methods are employed to obtain the average spikelet count across the field, which is an effective yield estimator.
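
To make the detect-then-count design described in this abstract concrete, the following minimal Python sketch shows one possible arrangement: each detected object is cropped and resized, then passed to a small regression head that predicts its part count. This is an illustrative sketch only, not the authors' implementation; the class names, crop size, and layer sizes are assumptions.

```python
# Illustrative sketch of a detect-then-count pipeline, loosely following the
# abstract above. The detector, crop size, and regression head are assumptions.
import torch
import torch.nn as nn
import torchvision.ops as ops


class PartCountHead(nn.Module):
    """Regresses a per-object part count from a fixed-size object crop."""

    def __init__(self, in_channels: int = 3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(in_channels, 32, 3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.regressor = nn.Linear(64, 1)  # predicted count as a real number

    def forward(self, crops: torch.Tensor) -> torch.Tensor:
        x = self.features(crops).flatten(1)
        return self.regressor(x).squeeze(1)


def count_parts_per_object(image: torch.Tensor, boxes: torch.Tensor,
                           head: PartCountHead, crop_size: int = 64) -> torch.Tensor:
    """Crop each detected object (boxes in xyxy format), resize, and regress a count."""
    crops = ops.roi_align(image.unsqueeze(0), [boxes.float()],
                          output_size=(crop_size, crop_size))
    return head(crops)
```

In practice, the boxes would come from a detector such as Retina-Net, and an alternative head could detect and count parts explicitly, as the abstract describes.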

Article
Novel 3D Imaging Systems for High-Throughput Phenotyping of Plants
Remote Sens. 2021, 13(11), 2113; https://0-doi-org.brum.beds.ac.uk/10.3390/rs13112113 - 27 May 2021
Viewed by 966
Abstract
The use of 3D plant models for high-throughput phenotyping is increasingly becoming a preferred method for many plant science researchers. Numerous camera-based imaging systems and reconstruction algorithms have been developed for the 3D reconstruction of plants. However, it is still challenging to build an imaging system with high-quality results at a low cost. Useful comparative information on existing imaging systems and their improvements is also limited, making it challenging for researchers to make data-based selections. The objective of this study is to explore possible solutions to these issues. We introduce two novel systems for plants of various sizes, as well as a pipeline to generate high-quality 3D point clouds and meshes. The higher accuracy and efficiency of the proposed systems make them potentially valuable tools for enhancing high-throughput phenotyping by integrating 3D traits for increased resolution and measuring traits that are not amenable to 2D imaging approaches. The study shows that the phenotypic traits derived from the 3D models are highly correlated with manually measured phenotypic traits (R2 > 0.91). Moreover, we present a systematic analysis of different settings of the imaging systems and a comparison with the traditional system, which provide recommendations for plant scientists to improve the accuracy of 3D reconstruction. In summary, our proposed imaging systems are suggested for the 3D reconstruction of plants. Moreover, the analysis results of the different settings in this paper can be used for designing new customized imaging systems and improving their accuracy.
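
As a rough illustration of how a trait derived from a 3D point cloud can be validated against manual measurements (the R2 > 0.91 correlation reported above), here is a small NumPy sketch; the height trait, percentiles, and synthetic data are assumptions, not the authors' pipeline.

```python
# Illustrative sketch: derive a simple trait (plant height) from a point cloud
# and correlate it with manual measurements. Data and trait choice are hypothetical.
import numpy as np


def plant_height(points: np.ndarray, ground_pct: float = 1.0, top_pct: float = 99.0) -> float:
    """Estimate plant height from an (N, 3) point cloud as a robust z-range."""
    z = points[:, 2]
    return float(np.percentile(z, top_pct) - np.percentile(z, ground_pct))


def r_squared(derived: np.ndarray, manual: np.ndarray) -> float:
    """Coefficient of determination between 3D-derived and manually measured traits."""
    r = np.corrcoef(derived, manual)[0, 1]
    return float(r ** 2)


# Example with synthetic data standing in for real measurements (in cm).
rng = np.random.default_rng(0)
manual = rng.uniform(20.0, 120.0, size=30)
derived = manual + rng.normal(0.0, 3.0, size=30)
print(f"R^2 = {r_squared(derived, manual):.3f}")
```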

Article
Visual Growth Tracking for Automated Leaf Stage Monitoring Based on Image Sequence Analysis
Remote Sens. 2021, 13(5), 961; https://0-doi-org.brum.beds.ac.uk/10.3390/rs13050961 - 04 Mar 2021
Viewed by 730
Abstract
In this paper, we define a new problem domain, called visual growth tracking, to track different parts of an object that grow non-uniformly over space and time, for application in image-based plant phenotyping. The paper introduces a novel method to reliably detect and track individual leaves of a maize plant based on a graph-theoretic approach for automated leaf stage monitoring. The method has four phases: optimal view selection, plant architecture determination, leaf tracking, and generation of a leaf status report. The method accepts an image sequence of a plant as the input and automatically generates a leaf status report containing the phenotypes that are crucial to understanding a plant’s growth, i.e., the emergence timing of each leaf, the total number of leaves present at any time, the day on which a particular leaf ceased to grow, and the length and relative growth rate of individual leaves. Based on an experimental study, three types of leaf intersections are identified, i.e., tip contact, tangential contact, and crossover, which pose challenges to accurate leaf tracking in the late vegetative stage. Thus, we introduce a novel curve tracing approach based on an angular consistency check to address the challenges posed by intersecting leaves for improved performance. The proposed method shows high accuracy in detecting leaves and tracking them through the vegetative stages of maize plants, based on an experimental evaluation on a publicly available benchmark dataset.
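
The angular consistency check mentioned above can be illustrated with a short sketch: when tracing a leaf curve through an intersection, a candidate continuation is accepted only if its direction stays close to the direction of the segment traced so far. The threshold, point representation, and function names below are assumptions, not the authors' implementation.

```python
# Illustrative sketch of an angular consistency check for curve tracing at a
# leaf intersection. Threshold and data structures are assumptions.
import math


def direction(p: tuple, q: tuple) -> float:
    """Angle (radians) of the vector from point p to point q."""
    return math.atan2(q[1] - p[1], q[0] - p[0])


def angular_difference(a: float, b: float) -> float:
    """Smallest absolute difference between two angles, in radians."""
    d = abs(a - b) % (2.0 * math.pi)
    return min(d, 2.0 * math.pi - d)


def choose_continuation(traced: list, candidates: list, max_deviation_deg: float = 30.0):
    """Pick the candidate branch whose direction best matches the traced curve,
    rejecting every branch that deviates more than the allowed angle."""
    incoming = direction(traced[-2], traced[-1])
    best, best_dev = None, math.radians(max_deviation_deg)
    for candidate in candidates:
        dev = angular_difference(incoming, direction(traced[-1], candidate))
        if dev <= best_dev:
            best, best_dev = candidate, dev
    return best  # None if every branch bends too sharply


# Example: the nearly straight continuation wins over a sharply bending branch.
traced = [(0.0, 0.0), (1.0, 0.1)]
print(choose_continuation(traced, [(2.0, 0.2), (1.2, 1.5)]))
```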

Article
Deep Learning in Hyperspectral Image Reconstruction from Single RGB images—A Case Study on Tomato Quality Parameters
Remote Sens. 2020, 12(19), 3258; https://0-doi-org.brum.beds.ac.uk/10.3390/rs12193258 - 07 Oct 2020
Viewed by 1563
Abstract
Hyperspectral imaging has many applications. However, the high device costs and low hyperspectral image resolution are major obstacles limiting its wider application in agriculture and other fields. Hyperspectral image reconstruction from a single RGB image addresses both of these problems. Using tomato as a case study, the robust HSCNN-R model, with a mean relative absolute error loss function and evaluated by the mean relative absolute error metric, was selected through permutation tests from models with combinations of loss functions and evaluation metrics. Hyperspectral images were subsequently reconstructed from single tomato RGB images taken by a smartphone camera. The reconstructed images were used to predict tomato quality properties such as the ratio of soluble solid content to total titratable acidity and the normalized anthocyanin index. Both predicted parameters showed very good agreement with the corresponding “ground truth” values and high significance in an F test. This study showed the suitability of hyperspectral image reconstruction from single RGB images for fruit quality control purposes, underpinning the potential of the technology, which recovers hyperspectral properties at high resolution, for real-world, real-time monitoring applications in agriculture and beyond.
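
To make the reconstruction setting concrete, the sketch below shows a minimal convolutional network mapping a 3-channel RGB image to a multi-band hyperspectral cube, trained with a mean relative absolute error (MRAE) loss. This is a simplified stand-in for HSCNN-R, not the model used in the paper; the depth, widths, and band count are assumptions.

```python
# Illustrative sketch: RGB -> hyperspectral regression with an MRAE loss.
# A simplified stand-in for HSCNN-R; layer sizes and band count are assumptions.
import torch
import torch.nn as nn


class RGBToHyperspectral(nn.Module):
    """Small fully convolutional network predicting n_bands per pixel from RGB."""

    def __init__(self, n_bands: int = 31):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 64, 3, padding=1), nn.ReLU(),
            nn.Conv2d(64, 64, 3, padding=1), nn.ReLU(),
            nn.Conv2d(64, n_bands, 3, padding=1),
        )

    def forward(self, rgb: torch.Tensor) -> torch.Tensor:
        return self.net(rgb)


def mrae_loss(pred: torch.Tensor, target: torch.Tensor, eps: float = 1e-6) -> torch.Tensor:
    """Mean relative absolute error, the loss/metric named in the abstract."""
    return torch.mean(torch.abs(pred - target) / (target.abs() + eps))


# Example forward/backward pass on random tensors standing in for real image pairs.
model = RGBToHyperspectral(n_bands=31)
rgb = torch.rand(2, 3, 64, 64)
target = torch.rand(2, 31, 64, 64)
loss = mrae_loss(model(rgb), target)
loss.backward()
print(float(loss))
```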